var/home/core/zuul-output/                      (directory)
var/home/core/zuul-output/logs/                 (directory)
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed file)

[Binary ustar archive: the kubelet.log.gz member is gzip-compressed data and is not recoverable as text. Only the member paths above survive.]
cs.i'>D^ ImXɼûvKȓ 1 Qa"1gɴN{$ ag!se/:[&ISv|ҭE{pU%pzϫ/^ d43gѲ,qcU E\sI*5AdJpQnb;dKh.XNzJ΃IO٠.-T1(g0RL^-өd$H We"ŸVґ% szt.(sepi렫Sw~V3 hǃ9+*+鄄-9 n~6}۪/?gq)sϳIqs*zoHOM[`Wwn #bLc}[}TGwLO'kNf. &t (h rLȍǛU#\m{@)mϻVgR`^f8-w}%UnJZFyZL4H,5oAW= LlNAP0KnC*q摹pZ_Tlb:2}}'զcd4\L'߮&yGHso/L.fܷC7;E LjFѶR"}sN\5N(Q(2 &3+b8XhɌ'W*!FǹvsE@F^Q،#ZzbY5զ0ߋO'y [?{QFheMژ[޲ܓe4g=Q ΛoɉIfd`\f24@»qRwqz{Qd׻oAh4}cMe# -2+ W/e^ ?}i:REVZI& # zQoI ,WV9mMR'i^EPrM&R-e T&ЁܔN!D'EUזT )yL,+L2p!k{> 8xvܾr-{T.x}OOlo,C@DhSqRʈA! }(ؤ[(e:^bNiL1%ZmQ`H! >#oAen6k2[gz3W8$^8b b*/R2G>1+Om@/ o4gOTsllllV,0 zsdN)63!KJ 1jZp*Fcr_RTrBPRh,p|RdtFX$VO6r?)$jw|h-:kc8^/9@d H_FЧpTTF@b!3[.*Bsز5 Vhi G* K4\} D ˜8Z'8M226,d{}0O ZA& ~ZWWǓ0s2[o׍;tH(>tϛ2K}pTt»w=}tlInlϰM\w]z=#`n[Z7Ժ]i,Zos<{u*Fl; lԺ{۶N׍Hwyvz() [=ݞzͻdw+9etr7V Қ4W_5ݵRҚnv:;ˣ>:Ft} B: '6,6bsrQF}Z~*EFpiUU0K*$in؀ʬ!P BIѳd!ݽZFkz LI,괻<(IHw|Yn"n#SFщ46,4^d F^C4_E?.gD嚓'I2?y;p9-MID!,Kۜc0d,p'\4Zq 3H%Xq9&.VϊzyD7FH%-N-͵0ɴ=_,*wzvE.ɓ\b/ryHz\ԕRixmMN)J9Fmmʒ mʤHq x+Jߋ*kGq$2,Y\ܿ&_Gټ[Vf`2!E R!hHFd2a{f]3֐^ZqXV@)QbsHȢ(leښ_KNɈ_T凍ĩdk D"eҮOc8(JCQPٲw%nԵ(1Hbrε!abu2dJ*HMJS`) }P`B@8˄NSr6dIZ'iR^@(Z5{WU̽ SҶbt=\ Z\O|BTѦ Kͼ(4DD\JŻZ]xY|h< p ȿ0(Ӽ!އK5<}ӳX \@oyonAWU9 ~}&Yy1Зnڕ~1v}x! ^M'7)7;K;YNZM^ pi*mk(_xc\\V$N0'+u' r۳`uBBqd/UL ,jCdUQORmy.͉lI۶e 6]vxmdrp&bЊX^N^ ~ϵ/k|Hf[Nij73]:ncлy OmPhEg&n&u9*e2c: SVwaҢyOvҒDN:-ZN0K[xn35u7&݉%I w h(5MhDڨ 8ae?))\%t13DeݼQК$Wh7M`$B_r.~?J"h(b1Ka,S^J*״4MSh΅HT7.\ }AH=su@UK]3O5\ f2"1fȷ sĽ2R%-S5Q (C9ˑ⚈ftPFk!(TB%8xKeL3" 4e9=`cI(COHy39%/\ji?g5ntf\/(+EGe  P ʮDKv4B1(nhe8ûxb;ܳTwGiMmΎ+s᧥& Zb1qK ISN+tPSp/Ev52GBOEZcmO,|:D:ϝq\jALB:gaBфh`uģ=X*l+e0 z5Ɋ5AsfKlRVMa⸴19d)؄#Ur0{wm+橈8OܲRJC#pIN]Jk0nxv 9X~_|@8.rDNeB@%+5.WpF<Ε\&9\h8AxDs.AE@;JBQ=-mt9&[!=N FQxh%gW(f|ɝ9Q4 Gcw,Kap55W6Tl4vx?q ֒ aZɁ>X!5G,Zo\qONJqOa)}m M?I e(Z* EPѱ`O  6JP)!Ha0tRiAf8A1NP6>iㄘȓg1LIwԮ mq%0xBcU4MKwGXsHDDI5I]8~̭KaZwgq\{X {k[|kktdL8.Pzy?gnq-ϡ=Z=wwhU_qD|*VLb/5h d a4wɎ.ѩVSd~ps4xq[E .RA.-u ͼOJ>WW3pɧT}Z#!gKMqmjQ-۷}7-TJƍD09Qؤ8}2sqM/6߯POϷ0*SqŬyv1⯕'?T>\MN}b4:_si?=,3$*OWy -vo! #fG.a̜*Qd# G3g1kpp3t-1ꨌuM6%h`/F[0?~L|ד')T GM$(l<37O~zкq{6Z6ߺMmrø[[Pqa? 
9:N:-~\y"οd_Tr> uU[ A*\j 1S>,ftP@%'}>w}tK<-O ݡ@wHR*$J"fA&#eZ6Zp%P3^0bG8_D%")qTIq!QOD%rVis_OҴ4_&ӴVΜGuYH/*zixN(X@h TRbK/-@8]B6N1C&eYzrN#cmhd6Wz UL(5)D'R8K#$ *Eшܢ%S|r&(sePu LOWX|F_`XL]V_P'D QQ6NrwWU}pe؀i2y9iN^ko3j.٬^rdC qr2,.Xz !Xѭh m;'"bgP/&T>7'=#!Dt|9H- ={vJI, 0'2dk! 0_qc( Dh,@A ?9b(QJOu{&{AbvVOX288LFqzUA@*l^U\O]~qqջ.αgÐ<DŽGO/')ƣiD;^0v,1GI$\/oy: 뗙S108]bƴ~WɷYbj{/g^ňe3hD.մ?!y?F?ŀ\?Ì?  ]K3H7F1hK|*N$jP{MN*D?"G_, 7ǣΑwϡRmcoL|c.:9+Y+I_fo*yҼ w?(3^@sK;} ($)8eJ3W*JƟ٣ϱGtfYpkSTLD8$& 6QڰWW ߯S"fRDA|SRl~{ @^tF73(z|8Vq\P] 8ʄIK BS)WT k\|-X`C#p5o@%7[Qxh%gW(fT;/st 7h :8)"Mwڂ \3 v};_>m?I e(Z* EPѱ`O l4#8CS B~h 3٠'KPAuqB%<K!RMc n =IQJFa"9 &C%P,9DDIs}yʻ%52.]&ڵO*!)kz:ѩLY1%k+sjh\38<.|d4-z>k'RlJy*/u ͼ?]Aդ"C=xta3A.A4/!N^amHݢ~mf-{t|qy' Y%7œ{\`s#p2sqM ŸW'Q}sf0j@GYѴug<Uɓ_ϧ_f'Q|2kzݳ e]PvI^zHA a$lH9UC#]4 ÚQqeV"?'`b~vFbEwnqK̺:*#G-iԦZ|Q1 3 Kd !w}5 uqJOhg0Rw*u'~tOwB/}~??~?2A L'MSPd)k &pouq{Z|qE%7{_|]6fapkS׫(߷#'8-t$ ȝ8 Nb^+LgY9( B TW#>kA7'qO?]kRܩ/HR*$J"fA&#eZ64=4_CeVHznol;k7y>4DHh<ĉR*'=MXD=&GȊ_fu*OߧXE稥pUt%iVz7Ѡ#3gQuٻ:+#襭uV I&F:ZKo$AݥIs ɚЖmXFzJNJ&]nG|r%#Vsvg)p@@H<8[x`g&wP!1Z}CgN)U/ֻN>oΡc"oP'D!IsHzﮪW?y77L3)/X>=Ω7;c5-=>|jsҬ|g)8ϓayuyd 8g%8a Sd[.oށ5Ĩ $yD-mm"#v_]3'2tѼ?C|ND5^JVQKG3X%8RT$X&\[X5-sc쭂cklgtzvKwtÒSq*m͇l{B-윦A5U\#{ƿ_vFW([a*BRi2my3JzuiT8.ڨY*!R*L5T<Ѿ≛Lnd-㑎K "8fYw3FѬUNv6 h>ye=({fOTq{7jW$aI*7 بL@8h$(rn|L2s%$CV^YC QZ @)8I Z `(PTs0! 
eBsZK kp1JZ+i-a{IC4pQoQ[U=z/.yµ&ugm|a/L3ԩy_~ۥX͙ûbthǠ HNNk^Z QM8aYA+ꉊ*7kم/.3`YTa=j0jx߭d> `y8\[ m V+,kK~2Q]jj'MzJ- ~a#&YHۖLOTAv;]nt(N=-xՋRt f\x)LMRͽchCwo3v"Μ6󔦱ܳM[mf QΰMNx=0фM$>Ǟ]3LOx4JK:O:iSהr3MwR6j( E o!I7 ǝ(5MhDڨ 8ae?))\%t13Deܳ.0!$揩a1xjs$z¦߫׊/snnE\ q+m"WYUYSK10!tw*/e^|łj+LvmA/ɲma&75)=v/ǙF!]`X^_XQK'+f GE|gIӫ@-b-}ŽV7 5ic (\t{"`tpIM2!s5ga1.%eKuI%AUI !#zj1!15eZSiXlvB;zmW@p&ѩԸ }\JLa%BA-QšO&yOpH?N` 2l%YI*9@Bo4>,;Ce(0Jbh BP M"(&*25CR{P*!F\xjQ`;tug>&~Qٚ@QP<7H ˉ<&lG+Þh''}BALa ~p*fp|\ M_G'5*LGfL U b,W `,ų+_6ŤA]Z,2"#ո۫K4'9\?8O: jNn:>p^ |87˶i˛˗kRٻZ0JɫTxry<(ʵ DBݘJ"+U045w랲z{)'KhY&\{]Is9+MCIX#~ӗqOe, mEK(.(EQHeVT._&>$^I,8ȫ0"^_ъx+r֗[IVN^+!՞ FA|p?jW$FgiUUPe4V7Al@eh,d]1Ԏ8"x8"8Ί:nIhd p~FP'Y@O+yey@Ҋ<*Ɉq~a z}s?`3v?Cͳxds 9IcH:essBN+ f&J(0s:pMJ]D/%"Ho"V:h6^՝UϮTpq׫|C8槚l{̏;c~ўN;UK-*(ˠBHa .EfFtYVFN)ԢMYrAceRvA$Ts>muq˸֒-m.*p'u/x7%qNIW?<?]~ _~,vVf'dBbѤ6QCАpdĘuYCp:G,[#&[IuہׄjhXV@2EeSB@a&@JHJ,Rv/}XcHǜiQ.ke2 rQ$ңԨhc$ .Ho8D"б^zM5mv(l:VDO Eu:ݣO^N.G.@%fJ-{L_)*<9Qԩt}9 (qԌ.ΊA\)ݪٽ˗f[\ߜ@`R~G#?=M.;} }5`h:b Ӄ?/pi|WDz4g~ |9fɖzA 7)O9V81R(R%)$pLQ(%+ȪwN+픷eB8,oP<᧵Ew<^GЬ5K3_azy8z^3^1{ܟbDֽvWV^']!қzn:$Z67ݼm WtzvGЅY[Inīw,6fVft<53!)f2(;!`<5殡zv:% "wl$< FٚZyBٺI=^1NB)k l:FfEZee=֝ ի?_ͳkpH^@6RAXH# C2*s#stkzJ)PrF|9 (-b8jLlxG6K\[9v$\)ʩgˢcqFN,~*y"+ 7N,2Nmjv^~3B1y 2ò4f?8ѻYF52o;){wLA4$|%5!>n)(&xy8b'gFKe< ^ fvm =\% qDfZn.Ӽ v c!ߔs?,>axm~\֌w͞ڡ7oNo.N?jVZR$nr58pIϷ'22Έfw#Mkzۛu̅6|:٥$`|9;ԎpW`%+\gKn颫݌֝,{Rل> Y,^r<>ɏ6Ûu[V[|]v1 N.VkHiX:GNpph>ns?=TL{xzyW$!cwۏﻏ}KNٳuс{MoOhjڛ7Mg 9لo.rGՇ[øpIa$KOYp3)"U8 +mR6R.=qW_jѫxZbŇ3Yؑnt.@e}~WN2.xZ i~ I#꘍ "-"`HeTR;KI$'mAKg6,, y?Wq`mF9"sMP5Yr@([搬l2KOt 64GIL~t z`H)&֖TxQ$B@+2Y( %E2'z2wP! ]q)vZ+pP(sn"OHBi[G]u]~n=4:4e:%iIYbXͶo >0'ɑ%1M,כ;+X\BJ[-PoOPYuƓ$ܨo?*(4f|c:qw5NJ1y>8neoO3WP H[x6)fy$[) ]&̗(KzY;P+1Ŕh84)AR*XV*luT,6fц\w<$ ܱ^VKE%ǽIt)?'2Re`KсBI>! 
\uy9!;UӕϓNjq5o [P$rX (h 2Z~@Iymb%D9+iAAh3x#އS2i F +2`܆T6a%$#s޸tdZw6dZ\o |>c3y5mIН]ۛ=8W~u[km9q ָ+QDeL>gV bpʼ'S&3\] l:εs{-:.!(lFύBE-=)Uvjy\Y@//orpqhD1EQ+Ȳfd7, cG)nu/7%vd͒Ρp"&b֚!nBHvER}C.鴕ac9.{YhaGN2q]4)D B]HA@Saҫz;y6{;e5XVK2~)erA^Šϟώid .<1(=]FШ4:)i489]p~$Rgx|KN2h0.FJI;8rSΘ9E`z|wi񭣧KW12BaKk9 B9dܗLрJR4R'ET&:{麨0~hI8i iI-ZPIW7E&Ѽu:RYa  MA4,Ϡ 9k4Ry4,ųGIJ$kHJ%XD_ZN+ä t 7egVaZW9o&$W٭^̈́|۸gՀңκN>{>dwRNrPaV )dm,Ӷ RFs& N=AA~1V,km#G/{7( Y'; v&&3X`ƈ-Zr6[e[m+R ;@lbXuX`D攲)jQ,\Ap/bԊFw|P cJX⩮%cPZy ((E} & >bRڶ_ϊ}0l~bs7}i 3P_.U9+.)d.F%/1%de^90eT|& w~;޴!z^垗Kk(|ojvutЫ$G*n U/>lVuY돀ĵ]%lGdݚ`r<5m<${>QQĭM ~_NN@^N0fBj 7/s7׫󊹪KDFӲS}~]Kإ3Қ/ԙ-mޱg:Fj qm4;q-[w=黓s6_ 'm?\y:}+7.q^b?yyZ4^|t*./c勦sP 4f8>Os8ȏ KQ+8? 8~1QYH4rF(Z+NW; 2 AMrvYFvݦ8b71}ђ)<<gwnuykR?lx4?y\asq؛f>=H~ALrF4 9N.R4psnmx;ۺ6epwK_; r"$e Ջ;I묽WZ'Y>;of8㖜ۋe~H dUO݅^?:EZ;D+S GqD =RI᳽fee^C(8YL2^ J)͸˚d2@ l8uԻ~|nzk\bo#Hޮ]NG1M=Xf |󌻗OO0U;(RK)-dսߊ!Z@2=1hgyoRv3*,N'n#.k`&Ki&u)zwy^bN9 +vWT Fvў|eǥrZV_d/=`mqV0nUsr#W]+k)v`x=fi3[Â& %S 9qL.pW`\Y& :h6 P_[; {;)yG~|2<<7};C@%1Ld]r@);o9$-WRϸyVv;>;?3hSVOzV YNK%q|0:!*$g8L60.^,i'S( 6< ]Ff]]ר:=1)Ϛv_ZO ugV vU W*|fNe"366v)nyqM[\񥵶{VF}y_R(q-m|_1[m;ͦӦM#.dokߐƓJk"䕔w+zj陇ߦ!lg5: e3$qX+Dp F/Pd6< \GZz<16|I$E=*7'NŰ}ܶ^TW+VZrm !59Be6X1J2p,H.Kw*h9<>Րr\R)JԦ(932D 4Z 99XqsO> l>V"f`̢vЍ QqY=udX41#ZXRN=] |n`5` BP0M|F[EմKDW'g" fsa-8 +#6;p܈rіN_˳S.3k\D%M(Y+.xm䢮tUZo묊ܰ i|Vns,#DF,+ZǃGk"ܧu}%C!N}^'T85C,gk.Q(-d Vx'ռ"Mr3\% ]A5ەTY :2Xϼ֭5)# ⶃ2S *Ҏ)L@laxLq vI(e :- X sH]piUfP3@)䤀`J*јA_$T$ut:>a1[:7 %̆HrW4:%A!h) ,HZWX Ұ6 S*!Z8s(E lQ w9A7K!ȗZhOX<NY3z߂VŢ4L&x,'Fp #hBlz-$v ˥rXD/K@B]N*<O es Cq2M!LVY@C>CVXY=0wuj25ja)` _XGi a#XAdp  u%+9dpoJ6BA1DwilULР%f 0 |SpPS FHղ+au))a $)byIZD0 V R  I6S# hX?P{2ȋxY-RZ[x# n݀mMƥ3"Y ԏ/X?Vd[żS,00 6b$ F`; /pfѼjǭyg'cf^[su0+paz%7>8. Š<$4xܗq yQ,R"$2;y*FgueAFXŎ f8`4D 6lTѤ',Uf3*0bV#HaJ ΃ V`8-(@u"uQ82 gu]~amLgP'0X*JX!%1Xw@ 8 JB Ŷ(`)[(!,KP /Bb^ޟ!Ƈ* DZ'&BJ%< n9ٍ`pph]bV6Q@ Evۥm ch Yh%#I"|KN pR-$kLr^e=$]`!Pdmy6B}XFϽa4! N -]h4F+2“zn9|?]&x@е ,Ey~כ<=Gss4IˣgǠ gA]z faYon!ʞU/0Q 5x>ki9 t禎HRz(d1!) 
pJF% +Gog~WFym|W M^M8dZ~oMb|5ϞoӰ.k_Gﵸ'?]֯ȽѨw?>,?o^|~Wt̥I/ge8!4PϴRKeeN{1C+pB_ \>WghAQuqt](L+ d8r%>LEEARqgYzB"vA4;IØ}+j|7̮t{v%J]-7ROcD::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::C::yut D{IjX՗C:{1j|Ξ\%?#Z(nꓫnٿf-\\d4yRԀ0V7j l7[Y1=CbS"6$ ,b^xֵgwg<qtwgJ(ǡ"mAO:k=a}ެNppŻ,u$~ONG.J7zDsKO>8Zf~gU´¸fG Y0xi+ǃfU*ZjC I6yfK CHJV6CFqK~iOTNZ,'^v_:l;Ƽ?̭kqǰ) .^k]etrG0\v57jQ{(jj[#T*&x$l F3Q!Lk՛ޒ"#Op>z^ߐ.Î5v!jk |/57YQOg%;U1섋wY! b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-&b-Pg.IX ˑ-~r-b *% [ @"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DLX=*Vo2`W5U :-*ʄYct)LnB٩.OfiipJJT T` lL6,C1aZxZ6_1$v5>b9Gfo_"F&] C WA_[҈L7h7y]X'N@pЁtEO[ V0tq\ 9$)\mK8~ ~< 0+,POA;pXnɀtֹ4 +Olc  e3x&9&9UeMW'v,rv!721٫FTmQJa l_ mk8AoO4Q)ʽHo?Oz3LGg# C]{4097ܘ X͑0O7hNV~v8l8rΗJp?<]l}Μx% a 7h~v^j3m9vs-e23;NCt<ﶝPi%Y"jB_[2z9 EMIOvL}Y/L//-jvԋV YHe `t,ɰ[ Y:a/Z)!E$-3V\ω[cB,Pbvi ~qF:M/Nnrnno4ig {w( >VCS㲃~zۮ>g᧎k[%nh$Q_ X3tV݀UjGC7> yAC7h ݠ4tn AC7h ݠ4tn AC7h ݠ4tn AC7h ݠ4tn AC7h ݠ4tn AC7h ݠ4t xZ)ekkCl5]\?o\w9Mf}WcqZK". VC\V->qFsV}\NkJZ_T&zmnc|hpEc9>>wv}+: }?i3ST`Sv04[/m\v^ ,A w&A'3ܣ{,XG?;YȾa+{L^g=t,-odusMtj^H!DO#m[lx(id1R79~ki=9^4B#N7ΧҠT/)r)7t!һK ;*vlqر`}m @wKaإڹ9rϵֿo0svPk{J tYC ] ĄYC7깓1nSC!=d!%}zm]N}~l~= k@f{ߜ.QoN~N?%y{;>L}sRkm-!!-eGCZ i}'?i-ܗ0D)!s_ߏEb8luwbjfv<}4~n7ȷĴDqaGeT(3=(s[8T,\cZEg~<1uUǞ(7TYTR}](n7}~X"DhwIHLcyMBF&ն_BX_jov;m=Rz~s9vv۞[~ WzEco%h%9%Hef3(8Pqءep~ޜ{}4Doo+&p%8g ZY wBeV9֋?P̯>[=dK=E/P |K͉ˍ4v/fV,^ά&\`> W)5j"O~FX7vFRCBIYmP޲ &hUi?cNS| %g^:QvUb|I;?< d׫Tam]}iלx9;{ͺ9j 6x3kuYT閛ywz+'"{oMT̝s'<׳5w0Q20  xmVNYAJD9 g%25AE̥ +m#Io^`e܃y#Rb"5,ʶC$")))ryU_DFFPm._^z7E(y,|'Q"EX,wT.IT΁n=\Ϧw7a aw40sR/Zs~-]w <ٽ)>38\=]/]OnfM 29:5sW>i2kuռq;NWmWٹMز–ՂZv>z^kOw6ޯaxqc8V֪wd_r˦fo9\Xm^nWWi_{*q4-5˭Od3iO.?O.? 
aerџ\'Erџ\'Erџ\'Erџ\'sdr%e9Vyrʣ-*״2M%7s!Qx6 tmVͪ:dB6kw&GS-gь8%3NhcHNeO(4ӱmh-J3z0b T{)pJs*w(tYv~$)qreK0TSk=}56|!vhK;їw⪣Ct FtT*?E $R@:TlRZ>J9V=+P1R*)"P#h i&dmJFǔJ:n}r(g$5LYB=$e%\4dR8jz^ϊM@>kwf N.v*>~ ^:$ ꣖ -"F< TۗR* fٖ= _9|CVH᠔HȲku.xDʓV9fI'ow{+AHES QX.r C>DC[1EA􂣈av{Bњ~-۽ML&X=2k q+Cez9ޏQӚDR(OKM4cpiD(c$rqϹ5юҨ}hOEK(KM,+:YײYFQxh%gWfT;/st 7#k :8)r=<_u !YfU L+`ZKІ枳Ɂ>X!5P tbrOdžrE<}61NFn1 C?I e(Z* EPѱ`O l4#8CS BOC8ZFf8A1NP6,i$'`Is#xB* $3RYFzjĕ*DrN;|?#5MK,9DDIysQl.\eFlǩqJ~/+=_}>ou gԏxV|a0ҍ#+,!зOw~w׈Ei=χUq#-lTC޷ 9kZszLè2&Ɗň?6̟Wr4P Axlo O>Hg$a#I +0>Q*jDSVZXr>| Y!z#鵽 K׽k7+_D%")qTIQ'"ĒHT9Y?N;}ΛUZtZzWK'_Rʓ^zyZz8[MDY2S7فB-:W[OF,JU]B]"muE8L@hTJb+/u>=K;dZJXω*,W%ђE@QE2ZZh҅Vs21g)p`<*EшoZySx7"cf*Ftv {&(| ]x0vCo&7mj<6n1BC{n~sنߦmY峬(:i;˥;Ʊ<˰eyH{wEkTVYlze4|oL šCA7?RY%|-RD5ތJԡQKGhj)` s,/N^[#w':hzTw{yv'\|5FTkjtbNUC]RqejsFM9o"*x2 **N&J|c#o>3/Yr0EKެ,Qz^IL?? mѫ{}=9>Z}UkKR^8\qrwE:( .}\k¢GiHtԤ\DR'h7I8AeUp_ 47@8;eJY/5R]LYN Mv"}^):yV&zyZꄖ􊣯XվcRsa#e@r"t \J h ̢%DExQɈc$F(cb┡`:&Yt%ӐFT,Ŧ_2F)(㙲²p|6;[rXqlToAFɷѰs] 4/ODx+8Hp 42VFG,l"-nx sP,6A DƜcU)@Sl Kb/d׌ˮHb\MJmJz#*I]FR+6* 4$(r|L2.hJ8ߴK籞6yJ !STK|P`B@ ͝"Au UQq+m"WYUYSK10!tw*l[oNui!>| o,֩U\&˭>mfkiʭ>in[eO(r~V)%f'|x3 8L+1a҉v 3EH|gIWxa{%Zg]>jYqm+{cktE N9OYpR19 B&\LTlG))\H^'/ HJ Sk y7 'J{Mg7[ СjeL͹G.iv:[>|1nT-XjRfiqyBPLFIN` 2l%YI*9@C4rҙS(0Jbhhnh@69Pq܃VA 1r+T@Ն ȨlGv]eok?j5Mo'drt.eNw` Ybb2#d .\YX;ٶ>'[P?{W6 c^f'v!Ցu)^wFtcv:$)R&)_Y)(H [<#DQEe!(]o~vs_IymbVs>`#W҂.Bһ3o$r}h~>Yձ&^d ٝIznNwgS4еTټb6F}A9'W|%"€̴-ӝ2'S&3\W9Muk<([t*z]CQ،#ZzoY[#~ ވ^Gin WV:}Y f,_ȳ5kݳU1d:cD_}/w.V4K: ʊl$֚w.Ӹх\.f\GҾ wU -Y)m(rMBGbVT-q1lK7U8Ut#g-w!jڃN I2mmʋzy ? 9 `<+qL2ᓻ<7_v me{C ë/W?YrhmŬ@r,Ӷ RJs& Nq+X?wı~cZe)`{#sJdQ fl l 1;i~hBLcWhQbWV?7/[gx%GAFQ 3cJJɌC2*};mEx5.9G{5Yꃽiep 5{-{o(l}CI. 
i߅J)7-tᨫB E]}WWJ[TW8:|kp} Y*wYaa MHr g/L,*yBp?}ƣDIQs=&N(8IcO?3H+*Whtf_+-Q|7 <2;`cv F!jnv uf `ZFqۨ2Y4J%/pfs?~"i^#=tOz<\tYWfc׽uc+VRIkOgNs.LQn)wS Q̻PZ d!eRVX[\r4hQ[.EJ-o1L1i4LsB87AA@c)1sVYѡ{OfN{> XUڏٔECw^hI/)f~z!ͷ/Xm/7ߨ卶&MXH9B | .JeP̝ĐT0uۚVjƥNΎStJ+DRbPsnlM"CʝV^}fݘ}ϲFH- a"pвqLEV @kP-Ykl)g\ lfF-MAAXH# C2*s#sdkzJ)P<"!7{'sE~amާ9K,Mpq&$I<jtG?_uo݅4bI5xQ5aTғqwxqكBsNq8hk<tG ߏ1ӆ"R9OdzFX ZkJcVFfYL9G߭e"B]*'"|cϴ,|4-3GKeEkœ#ݿAS8043>N<%j_5ytؖþEI1CFtIU_R/Y5YƄHE@B2nekR[rz R]S2h2%x~W-~ZU.p_\9Ա]h<s!Y_O˹u^ޅH}E-߯C''BC* [Zm.w`HaKgd>pyflSͯ./KgTٙ_/gցmb ɺ\sy?=ͬt][$ G.~ [N,uH6t4hFaY#lO4e Ww=UG]r FW1?Ѭ׿}SuM5ӽؿ8#v+)?闟?~?~~>=_i(5$s7`n =`h4bh=[ mXO=mƅ6aCl?. n-@R~v0ԟRjZ ͜ƲD!IϗE:5UIjIUjp4R+"\J%MιD{"yICs{a/f<'sb"DcΒi*HR3 ""C $ZOx:{Ol|ҍ.QKpU2KVqOld4Ez1gvUw d蕫N}B`Hyr*,Du9Жmf:0~#ytO+YΨ(g0RL^]:SRǻT% ,2F=k(K g,nϰ*GTkGA7늉N6!a Az럦Iu}ozwp}؀i V9a r{ @'W> v-S<੫k٩iÝCQ)g@f9_];JWf\B=&Ӷhb Azb)^5P*|+**Y+ж홦?Eɲ lKeJZ$DAn}2EQdrFA^U瘾ǴpG_;p͌^7x)<OPr@g?*f6*ŠEIjC14VWA@eVPi,d0SepMqN[A1>.:-ƒ4m㓚Cہ@h wJ6Q,,rHCːrAd3˜; LRL@)dKRKIpyMOfBN&)$eY!$c4QC D s5qdpMB]D/%"Hn(:T 69[;- UmrŮ^kOvl4gNWO=E믯y۳;^Fn,E'SYMf?͖k-j:)Z$ڢ L )d&ࢴ/5βUHcrrikS\hU&eDH$,QsJX֌sf.l3 9kYN>.\RTmzYjN-gL&q:4Od5vYBe P!3$*(CɀV9Aʀw*kAn!-ie$X)(,EQR!hHFTJIf]Kb۲n5v Ӛs/Z5xjm:!6TUV@+6F@B@a&L!eX$mQZl>%N:.crƴrP.q׀EB jH1$Ho8Ēӆ5ltIZ'iO)i-aO ˨نgW)*j2{ k+ K:\i.X酂746Zw&g8oUڕ*#T%TJmEfJ-'Q#vs)⴪+8ӢkU-fͻʓORoH7)I~$"Ȥ5HAP0KnC*q\8-<V"Ȭfn C7`⺻/K9V͉܅kCq.}?rfMur^ 7)OɷqB(I$%`RdbpT5| ,r;k,;cOV˖dY.IHQ[jM>ֳv7j fJqm\^n qG6-Vu"%SCZywY&qÇ#qUb%,րYo'=qT5GWA_~cS D PQ('AE)P  AG*9oE2a=IB82$9O5Zks 0%QT:=VLe/ K b06xsJ[llMB@I8 "5#LƤm4>A>TFYTY^N%P`cQi9@l94FCۉN*)+ ,C7 O0Pn1 &=6>t>5W-%:OFSm~j=M=bbRф,g3e}x'g){Ijhq_2wO>רyHIhj" W5&(N!yDu}khYG,$(7>Q-18SIȳ; "ֺXN,΋~W&y{̈́dnlVhz>4UKQ`riP\+S;g^(] 6%"un ~WQ\%B)kPvh(Xj#pjk%NR^EJ\Lgv%€G+Fwi0vݖ(7ƣJ2s͒nsj\ !g1ۏr;P,CS%I@;"Vhr>䷎Bmd $&Yf 8.P፧0Xs DnC5iF0Tgz/rvbyw};s80xs2 O6Ʋ/_>w%`50vq|ӳ8r@3fG_Ot܄}.'GeeYR1TsK MDHS`,:-x o5"ۡv182gigi.t}59Q|#˥9a0o#כۋ?_O2 my; ˿O %#1#t:c}04.Չp{-u%% bL! 
[9b9zjQK+82q g*  37\$8Ny#e<=ˋK)ƙ"0KL^M׌{&H0⍣QL^7EwN;-su8ץW/+tIqBY?h0>FOw%fM"yjպᜈEm'y^oM.R2C3ҤHKFh* :Z'Y%К YF6ubqvx3&c.k2g> m1a l3t"vc V􊾩 a ^\Ǔ;>e 4U`V⠺n R.(&Џ_fr5%б&JHFh6oF~(l !I^!~>2h_RLbDIp6e)U6F)(9w!=sr)K$7$ )PրE*c0$.&Ύ6S#fv8om'$OfOmM}2j:MCu\3nrn7w5]+t4%!76 knmFۇ|֓w()f3vBv5ۺݴϮtr+ϝYr͎n:3Jە~ڧRTjלJk77q%ruID/`SsUkZ.{4']< .=op+~+OfzVonk=#;C\g Ywzc8ӎdZ&*.7>rZ%wzA=~ݣ>FG7?uZV`630t}452ccFe8ŌOQpmݛhoiiIjzvrɌG`ŴX5.;G ^̱ ̱ Ѩ lZierTۘ EptDъkV *~ƛ j '$&WG&F(^'w"՛`>'>I(F+;:ɘ&4GX#Hc`9B uj@ihb4tM d `02NtRIYQ`Y}j6N\tl}"e1JQ`. 1O15̨3*FAٔ gqٔ g)*k)ĠNT*qT'*ՉJuRT'*ՉJuRcAS<هoNpИ@D V4H}滵q}swe2OwwzWvg>ݏKzo BzƄHC-7H饁vAF&j3LQ[$, AJGcʀ2%)Ĩ5r"R?$ ۜ DuU%Ӽϒ x0yK",H\KuJ1GҜ"ܠE%DETąb¦"'0b"EҀ?  2O}F1Q:Dpx ڗt)%vmnq<{Epar=YCVM SLny\YԳbD/ꕷb(*: \%&i #"!hZ 2qd##ZXFP'*׿_V+/9&Z@w}}V-۳um.vx'9suù[o#ٿ2}썃AlcH-I DQ$5"eQHlkŪ J~B}xCTM[}LMf !\ƿ}֚njm-cD6%l%`[ Zk,;nެV7+r:ugxQǠcI$tפ ڎлO6= Y:FpⴟR=|*t1Ù:MՑB]3E4΅\,A{$hՂCw9]'ү 4)_mq|*Q xz泡6?jEwpaK*Ljd<)V2""&ZHHH8Hw?_wYϠoW뚸qbZ[}S7Lzuںj{a{~]$^=$l^}Ұl<7.Z$V;rRjoSsO߂. {O (sO{|8œzLX{Do %(Vzu!v( {Ԯ澽:e^{ ;]DU]*s9z.[l/c꫿WEVt X `ٖA 0o8gZ#Ÿ"2je[u8jEv툷1>MJ4&80W-s]wUfu d}}U+W_)fҢ&oG|O.w;_`A/qtHL*EG9eH=F0*1t[dki"Xazy+:Íu3׭ڴj?Y Un=JM<H'*yyjӰhe6Z!rF%U*O[%E,o/\1A޹sC%DG%ߺ3N-pI6ieXJGj"J(11){0N0%)s٩ ڌtBI  m!M$i zg!)f:UgH2 /_1|d o5HE, 1Ɋ9 m,5P.rJb )LXvsZ=ytG2S$;ÐuCuL):A/'I:3XdE&L 192DZ6C4ǥKP E" ` 6SM;6R0VnUNKa&9s FjcNb{'Ǯ>6 )GǓт@ɩPՔskL+Qд HXG"t<&y$,Vi e)Avn{kYyWLD.=lˍ%{9<5V!pJzө&50oTtc,'!uT|A)6GR;"E RrHd*DwZI(" (O Nw_z, 4 o?>٪(3'( %/"3 q(N&6%@QchcMm)EvPM@p EHy84K,!4!=!uKH>1z%BGYOU$ 9H @A#X$U>+TָOɯura\xvٟOq*\{H0F{ϸ|5-ueLxVRQ^({ =姐UjxG\;(?8/@ٜ)_IS cKrxOSwh&pnW)0y[  .I5iatHeXHՀ`8<=I8ӢLud|RzUva 6R(E[_z죜8J]F0`rQ.ANk5*͌011uV6O.FpQySxy9=*f@|zV,+;$ʆćǀ|ChH#1xeÐtYTy$pT'XIz'cUAQ Z=j=e.F``7N-PSH}4vIHCkuSS8?,_gq0ǯ޼N{y_޿9y &o'޼V`e 1l[C|CˮAi1D]κZ6.d5l5 J?fo/WK 7#'8]̙\ 3_`%U]P}*0y; Qc:4|dvp]8O1J.q<5xFR=]#I \1*8ı`=QqY+Z{iuDcNPflo$}koCvMv<4Ossrf:~R{%o>G`0k.ƣ|Z(ڜ-rm|SAAIgLKƘT O45q3;f;tl&#nK}]e~;kE>ג)E 0%c,ޥʐRc̱|:ę k3d{[{57l3{jUK\d.+xF`ۚ"oz/K?ҵKk2DVY!=e=t:ҕ~{XYႃŒK=nZSB:iT8Qw70o`f0nX8eK{Wͥ+nEl*0m4$H4a p`FQ$ 
Oqv'* H,Wo _ejD֣lFl 1#;9ɱQɍvbRaReS9̳]bZufybu Цn؃bIE5ӚwU|m̩e`\\c|cv'o8.,iQ4wD$Wk%Je$O2TЭXk-p%*-&ЦddXEk-so[sp]l ߑ 8F̸"' Nʋ$$tGpq,|NǂU|郃e"o䍙m86xhp[6z՟|} _\FR˫͠[Tpq@`d68`.fiDR~pT^?x^>uxx[3`x,jzeGͲ<&%'fSye_A2YQBG7~*l s8~4ZϠ|bSEw6fWq*te:j߃\P;Ol1OZ/x?cX(I"Iʌ-G8K3˓Kyfc)bΔ%,Ni$w$yk.PHTh՞4#,RXJ]vYCL&? ٵxq. Y/gaM]R[M<̲_"\|~ ;&]9Xqm@L^L׭1UcWmz?󰈉Z uԖ*/6(=:ZʒP9^z'k 1ZS%?gsMU^.4~澍V5*CzT1HF_Φq)wE%K n.m52>vt1:WWB"*=gW'!vٟr~J%c]`>v䱰ήSdW`;뭟Q gVMkB{th&٤*7>x4-zXMPC 1αʐ8Xs,9 Z{` ~ML-X{61CoG_+06YAJպX*hyI|^\M$7Vo+51gIb ˘ʔ7h<|_\o[fVu>ETl2A7M^'ƥV2A{R2) wU҇a#`飱=KD%{K)Zr1JKhUW®8㈐]gW J ++îJ>vRv]%(ϓdWRPr}J_x#iBʟُg{UjEY*ȉf.gW0˅E9;fr8HZ`!/yD*Ѣ7)ދ;4 `XS@h%G)ﮆY>_{_%aAZyu&/&[KjB^a2S7^qIb<2:Hߛse土NX̵Q*g er+gI1GoL\#ٹvtSϟ;P)PnsfEȒgY{+|YY|? Iv'Mdv_v`Qu%$;ol%Y?)H֛U.(H+ o֩?G2YCaUk:|SKyPS$H|"&)+˓ r61jAQh%zGfxѲU-(K$F)RTk1=S5bXBlƛ[|8Ua5EFwծ=Vjhص>|CK&w)\Uޘ>r8[Kݻ]O>} tng 76KjGu]V6w{4uz[[wu5ܺ;k{\dK6Ao^ͪClŖ9}}];r3>[|w0V[޷=薎Ѱ;+Ps/ZWt} kkڭ2j=d)[5;d}rkm#7܆Kf̡}W i|H$Bh{hLw%V@^h0qwayw/u5&n; lDxNXf rԇx Fu[+ $%hW\[/, !\NOPE#DF 77hw+ 1BDW]!iM/׻lϯV Vh0)Vr№:Q/qTʸ6Lb|J9 ADtQZ!. ~B\h5AxgKPzCQ[%QbG4R;R3IR$a$ ZD:Yf: OclT6q  PA 5Im|&@F%91  K2R[ YPMWbts^o+Dc <42 ҆ ARe!ɄgFxjΤ"\RKc \%ƭw Z{ǧK#tNAőN/  ux:j#(T9$u&pt / 2b "2KXGьK;U^FS0PNp>nB /%u33ҷPO$L5s7(^6qtr.QqD HNNk^Z M8aY/qceLL2d=L1zD$I4DEwJ2Ҝ8;@lR Kaƞ¼ж3ӛirǛ2&w0pt`0}ПLqX ji^HVp,y{O heUF8k[Qeѩ('ː?gmr^:GMLXVm sbl~4+eSv1ؗkӲ\{.Zj5I̜8/49¹ c0LϔKS")(^$vIZȨRVt@bLP6'9||L2F5h bl燕Q816#ÏpDrĖ#nV((-6)8Iiʣ,e>FE5ZLh4Etv1psG煙,%E^//|,k.@o6~oMQ^mxߑ$eTW.9T{'Df*dW3/jEDJۨGp<;, `t~bE[cBz<"'E3D3z? 
xL0ϣR 2 H%ԙ-L%ʬbN07P 8-h޴¼@A|4毧vt/=Tizrp+[1.ǿmqDYe-ݴtHZ$BBC"FFdRL")qcH%jO8KNPX %qɻ|;A=,\Pht;"^I\f>I ok3.8'-]6.+5fK#GuK5t;2aDDwL.?L5(*!v(ԳDji;m izib^εrMr\-ۯq6?7V 26, :[.|.zܿz9ox`wک|ACΰM܍x=樔 MExKd/x6UVuvj_j_[ԯK|بu|SZԤ=q8rlA @i D{ F HIIɅ*(SYI&*ehϑ-0!$azsݔDb@!UϫsRȱOCP#+n}q,/,3k&v>a4y6;f6=CY_g7rp)g^2BB5_w3+?2h큱a;(>=Hdp,ALJ(?g#AÌ* zp'w(taTx}ۙx=[|wwأxÜQ鏕*љu~:kӴ`QY~fz3Nf|T>zΘEC܀* R@"eH.50ڑrQ̚7N-\)F9#:JMڔ(HUqKĔXޑf)K("2P#ruMX2Ԗ%\4dhD8!&$87a)Db!@-!%}|E;^ɨ2L$1sM!E(Ah11H>'ofN*ekdl\q L;}-=Nk/v*!x7wK{OMzNbӜÜv/ 7Й{r;HJ{swh_o8q"v59S4?0d'\+)'n2+?;L^v?"Uʕ^]6b:]f>WPxgU, |Ju"Ď.s*6Ry:=M{q#,L$\hbNrGMޠEgjTk;ňVfo>W/^|ZU̞+Gr{+Np4&3'D`QR3av$!׏tU7 FaVQ lOy w܏ٿO7MnYQnuaA!y`8o&Nf=zB;YV?xکԩBpBft>??~9̜}Og_>'\) 'u (b~{</~~к8bhT{vZ/.㊒S)帕1K[_;?0П|?:N2-~V2N\|F2%!'0ϏBsP. Wb!}1%Jx O?h=%w=ޯTHD5>Q*jDkmT{,9hNCeVHzmoÂ5/f'sI#@d,cJ(rsd"$4$~i) o$I/3ݒw=D DF,h0FnyH.(|hLYI+)vF6ICW z`5Jkާ[e>{gcغFKG 1 bj)jbpF&f2#&v_]m}1T V&,**Y~sOᧉ9m!s[jRrٗV+iAh2F48#O;O˨5H@PRXAItd.P Wo["F;0һP]\}b|^?gيN,Ag-\<]l4/ytϿ4;qmېs+%)ʨSt93m)="y&Ȩxf%8 W[ME5.Ip6 :>qm6'ǍB- U6s/Y?_*nv`x+|ڣ\Wsy=lqAd+~(3 }&jØlglhx Fո"kpQY"6Wh*+bi hI68/;پEڱGxmEK CU4w? 0sA\B<;>-|Ծml׽`n^NAL^?{zu0<~6%Fշ +URJBu.J0DcETGn&ඥKimږ6WiK;kvC&"1礵>ZdweZq2 _-OԥV g#u%Z|h#K.Ͳ˙p?K`A^uOҸ\~gDxM!.Y6mr黷mG?~6?eֳ-Kl]wfW)UAj1|.Ō7m_s8m)3cpضHI"-R|t i-<㥟V_/[)++˴ 9Yy)\9PlS.EOVRN[$os:J ЙQ,A2q2Fcb@:<ǤBPRhˌ R Rd܅#l۶u9F(=^0ud_#r{'?ul(>.!$. ")m}jqƳ@gQu2(ľ6"#(Ob)ĵ[eT (vYo[f8D3'j=\+7QArf'QaT _ZD-h}(#BSꊗq(P;'P UW/P] + ʛRW\CQWD- MZubԕ yo:ш +W '9v|gsLZU0;HI~|GUW7!pZO=j=j3SԊsUIG:0\FTum Ir9|2MڒCW?džK)&hk O!j4yD 9 }C!HFrߡ^s)Eѡ{4m[FFד}r&Oޯ;9uk/t=kSŹyA!#3&hx"$.)k3L9ƱVr܊Pe)$Jsy1)U<TFٚHVN.3l<DzFHqb-/a"@Ah CYY`6DZk$Yzٰ5FΆrV՟?Ռ:QSF" 5b2LhT*($H5=}ƔPcڽDE~fwym=O%SJ9ce.a"h&[DI<j5¨G?~ҳwG[FW(FT9 "A)-0%Cth!4O*F0.vT+[-$<В-kH\gBt;hz;A&fp,.uD1a3Z,I锕YmBG֜'2mW*NAEcm4,|yV M't^:25B5:] HS803>hB+͉ Uap}_1V:la_bjX kpc/RէTSnҕ j Q?MfOH>}fy2.iv9]/,2S(* *=>C2MiGLYLyUNx1{aO,sP"҈PK$pe̜.!qƒ)dRkn̤R^yTV(B.%. 
mHo1'lkL'/ѐ7Ehڷh}E>ߧ@yO+b'+2 wE $fL\XJU "IM)/'2#sIh9^л`ffsoI0: cƖ4lT[D$GTE {"qǚʘRvPHԓT.~ 볻͛VB; r&ΠVC$UBy`-?kL&-!kKx% ֪gł,i?ra?_& (2r L?w#v ҿMSKzӥ laMd7Y5[wRT?3IշMwʼ{f",r'NBdz%j~*rՌpkbŇ"f8#A,Wńߖۿ^-b W{;XptJ:dHFG* #F|ैV @$=uahh<{'scJQT<,S^gx#1(0xmW(rΪ%hyVq=T c65Qڤkݦ3F)V }V}anFn#-oV,,rÿ0/Η7cS,r((-ˬ G PYLhAHP2"PX-#s,)81Gkc);{SJ(|6y Φ1VnL[c5u6l Zl2޵ݦ#>6YgURh;E3vY/w:ty3򆾮Aʳ^iKm}w@H;QzC"=Cɾ5rqS0&TFۑw>|?Y}sKA,KGeԝi,ngoDJqgg+gsVfd3rh hdT++7`ރ&c smMjgm7 %*_ď_ʻM4ENe^\cQF;8[hzRfn /MN΋󲜶%4^;EOKF(R0QA:UFPSMFsvxxxN!8 dQELVF׊A3 (Q::bQętu :[_j@@u!X8.CkXMgchoْ%6Kamr[/}i;n>myƤ v($NK`]s䉌I4Nyx:V3}іj6=q<ی{WV[]?R;N1SXovha2rN7 zq}mjO#OiI֣/=w10]_%.o ަ7p~ X^;}=8& |R|~rϷձ?4hϘrLFM* NȾBWW3V~Tlj9q'.g|iΜN:D(~3o9ELAhoLKgh=\L89:y MmmTqg8WͶcrGL}Z3NT2}qd]8ҡf!ySmk1a|}.zZ;h3Zگu˲suιu"%:+tst~}VrJ\h1ťbV Ry)I}QXX_T;E4l6mUNYAY0D\]䂲 ZwK}+sXR1QU\,Ǡ={W*:눈MA~et=B Gp-u,aG8+r"?OoDW|pW|.^-5^mF߰W| j{>'_ف1  SS fꝅ9락}j.G )%ʦ:jvb#'R#&K]TKr;}~.U,RU\]k =RZb 7l)maظYSLqQGY9F,B*#t}-D8cNBS>ƤYHHѕ)r&AC0oջ]'q oa?{5n-Os-y`'bZO3[bqg˛ŏ8FI1N<0aI#L> y y)亥(䋞ؑ(ҟBr haGޕ,lIeV bBuJRt xtc| JQL7 :AqZ]ծĢ Yƺb,T'H>:z˽{!B+<#7_7<%nX /wʼ55u6 =`z c xDx+x|z߿rKsPy5iȃDb_S ߪ@¿uDTo2֚@9PJa->ٚTp1:F'#́X8.E]D|jf)& m!rM?}k4_uowί"Phڸ@NTtL},1b[nolH7p\0/Oԫ,_A.Vli8QOtDG?J.OM_Os!QP6`ًUe/4Dκ`r%lM| 5^H^_De4bg/%"ؠufA\K-HVM֦h1axv?6nVg?߶[n-o?{l6 _u{ķfmvϥ׻ᗾY]s8GIڲ Ԁέ_p.0mt;czo}}b JgsD\;ɛ8QoFʦyM{B*'ܘNzߦEnh|=ZkZڃr=:esR)(77kORU=h}GMIs}Yo]r\}EXzMgʖhŲcw3z{^iFă8]R4D&k.cڡd~VxNb_7%&*sr uZnZ]v?t=<:jmVکTϣRq/jI!XSkKZRuثˮjZAqi~$gڏiz*|L } AeumRt3;]1>P\˨lڎ?!mŭ,/ǷegezV&Jr0tH!)˒6B28m\4Zq 3H %X5'Ӂk"l%[^DɍRIKRGvZlQ<4uyfZ>O[~=vWosgl\w-#0P/ ~b=#C푷I %1|d`rgePN!c$0,y`:,"#͑S!"MYrAЃƒ]c"o0i\R9;L=<=X:,<(*2*gH4ci\~8_+GGBeIK BfHT"Q32*YQ$RYL IAHQ`Sj5 L&sY8s̢9;Gi͸<Ԯ6ڋ]CNmxmFbdA1LJ\HY)3,Q˺ŗ`I3@VtX %dQ 6jW2J=}C*HF5+a5racԗ acAjq,;D!Oe N@P>i JDN ]JD=-J1mlcT p \p&-F͍D#YLXC5r#9ҫecuV}qQVEbq~a9&<3$DMϯf%(ף^9%,RMi6k"D-s~ZIS䆻8Ȃ)M$W{2J! 
%.ĔIsuo21kVrU8xs PdxT $Y+%IeRk/dK{J zc>Oqvuy,ylxcHj / C)I^դܤf'qO%7~Zǒ0ԖIDܗNtYeO3qbeIzKߕn:qḴY˿ɾ?J Dbt( RzjA{WBR]W7C z2(=N*ϔsn}jg:@ȱ}-/ŮåEd9-sGPҞ[띩x%%x/ V0B1N2A/T*M" * *+Z2ZԖKQs2U-X2L1i_ZeDǘdTk#9vվjP0hvbzk/LBO{z#w^*-MeLQ c]fFeS\h(Eh1',qAQ@\]ʠp;!`<$]Ja5 4:E~J+DR)BBh6*UlTٹIr.d۳ƳRh812r qa"pe1,0-u@I;,gQW fDYIaUVL"Q0Qi>HQ Ń+& R`0'HӈLɄIdAt&vQO`Y(Q2G0V&[}%+:k|$*`@oO: + CX ϝ:+VFm9Y:f t"ZFL|JRP(9Gֱ(v] E-$vPR]i/Q.yɘM2d 1qõ 䴵#镂k+ {o<O3<[ӌ3BGK8D"~ߧPj쳹*5Vx zYy^9ޞTi0zV+IewwGW-˗?}~|=W9OH$i3_>MiԤ0a5z 5e޻CmA\mI4Z>?GO9p3"(+D6< y0-IjEUH XLW8Ү4Pܿ'}t[x\T~rIQl|"\J%MιDRD!*@g$lo/fGK''$DDǜ%::U3 ""C $'C$w%j_z%Z5ZK˗OR>nCZz0 QL(%}ag|}F :=_.??' '5=:h`v 1F8ɬU+jTDOvߊ?-ŚOXYz.rnКK!2L9qc2ȗ|1\MۢhZ Z˒#q܋pFG%'cI~<6Aڀf21^^ٔ>9sl^ϞPaR~`WBDJ9"XgUi]LeR L!]7[6oVziSwtyk7!ihsJ>K,rOЂῲ4&QJO2->3]&C~~"3B~xE0(&."D.:7zoM]_vps*j|;k7YAMu;O.f Qq~pyiUb!=!-φV:џ} ` eJ3W*J[-RwKR/vDskpN@&NDK#`!()2!YڪMj 1 ,D31NZs&8CH\"n<b,9A]hTokl8aާA|z#C#j܋~Tb>9/J|^T6fʗ;QA$$e.$H>5QBD"'M18meN0v'xZʥPCh$dR5WP yvfoFS17>Lrc2 G1|N)T}SI޾s7 ~n7B= Q/i2G#M\e_}ݤ5y;w0qQIGr)sa5ΪoE9{ysRӣ7M>̮ /W8NwiUJ՜׍:Oӥ鹚.c8r%?!o?~W:iВ肪&O(!FO`TŬw3k7ؕc7~y,CQ^ze(}>af|F׏ZWJ#)ixR1# Ul}+z1}gI)֪rbZP y4565&LW:9?~@h TRbK/-r]zuF/o{`ֳe2bȽOFS2GBQNFr-;}^m궹ڨ% <j)` s,Ny1,beq }-LN0K39Q97yncX6ֻncX6ֻncX6_ƺPZ ?(U!l(<6iB:GYz0JHa$hNYGp:_\ dy4GU6,i<*k*M-cnȬggqMW(aMLXhz8ˑܻd !DDJ/V\-m\;t'td1ʺ?}jiP]fҗ'@ȳV޹V]U H.51X2<LO-gjmſ)nmjdb0XD/(18-(K$ NH:1=S-#Uo=7 nz]-!Wnsp fdz^kv"n;,hUN YOW^O<tnk!׊6@h}9K3>B12kUsN>{lxC4Q|׷f!,1e9]׼_H9Q-r^J^fo;C[yf_7dfOA3y C ȃ0kϚ:O#*%*h6'YR|`2S4QǼevjd쭋i˵)I"i%Y4Adml8E{ni=؏9Β4V-aXy2B[ɉZD)nR9N(0Y),IiXQ>pi8 (c$rqs.s &F0裶4EKhy&Ζ/!!g> W5*,h=P2Ap Vx9mr` _P|PΦ&:UiD Nj5J5J DIx0sXfJFr:,)<*(Mn5:1z^Fp¹AB gFxjΤ"\R7h SX}shkB[ԙddV+<@w˦pk[?"b;qJ+ aPz)l"> iO?}ƿ@jztUkAL}= #U zֻp!e>:i> %*(:N{we#nY'}k, xRQ(áih;S;ńzU L+`ZKbX!5:D@q 2-kph0-ɁUU4{ɔ3 %̨$BEǂa>)g GpU Rt?<3͠ g6( %Oi$_B N!|b1THHok7Vqiĕ*DrN#5MKgG,9F""˲^IPxGTޅl:vL?sSDV%?8D_*82XVfd^ǹ/. 
b_^B1hT9HH{$9v`hoz8Lć֯q7&A*1ќ%;&GZqNy8r1\xSm9}7(N^`=uJy*/u jۭΦ7&ϲ`Rü1tԋgkWM#U~[yr<}TJƍaa'Q&8^Ųr+67N\5cB(Vj4M]ɳLuoU/\L./8\+7HEݲ.ZvHT o&USTyaVP=>XtT k.FbU=a5?a V̫pп5xQf7YW%e_*#KUԦZ{^8iKTG_؛1TIbXz_f*eP'3 E]_:'Oxcgwg2svg}l IEv;E;E붊fE#٢hMڬ)WYr>߼܊|o>??:v~bͬC OPlW@aE%_KtPΥ WB 7byTPo*&p6!vb-?ZiK_'WL I YlMF({mNE,{,|BĬIzikl+!Z^27uyz.i<ĉR*'=IXD=&GȒ'ݦN+:c\慨khhM'V dK3`û5M<΁.c9M+=hab=Ta{nhw9-jv;(TbҁZZHBtR+e< , F sG#Yrk)6rn"Ch~F~~KECTeDz1܇g&M]_v@i=ʷ罷7Y4U_R>%6;m^Vy>oG,sQ bV-(\m*Nm+|LL<{Bʝ6!.;u nAuCuۡ˵F)S--+1.q\pׯ֨z\8-`(ڨYJ]n9O;暗 >,;-u˒J{RI-ɪK,)enT&333NV5%0I]}A968ԑ y0v鐷z+fgrhQmSh̭'χ}WTuNB^})pJyвkPGPcyPc_ /)jȌ[;d3%M K[w^q+&,[l!'v<='lNRu^+of;k{*ъgVٺX)ڄY)qxUF #+25j)5%rR6=}nEukHD.lLFU@72v5^q j £bU_ڦ쵳}Oxvvty#vGb` RhjrmJ*[ Qeq} TYo+ch_,F5 6-L3>VP @5sUΈm:;];\Puں/jWڃn3,l"R1SLT dψ@+*Z) &WU\ɣj(:[n J LZE(?T,*KP.xm:p7pƹ b7 "ꁈSѡ!ҥ1'43F0GF@\E2R@bcF:Tr6I%eLLY9K$mdʽծݦ$i:nVr(.θh.\ܪ6űN֧29_QkU %uDmV>p8zP`P&`xԾ(X6xr\m:{&p.RoԺ;Z 'u|m(ҰCFl?M3[nj_M7x Q61+Mk)>P0PS!f#4CfOX\=g Ϗ<#p>N!ꌄB #tT. (.:'kzE<|r'cc꽞ĉ9$Y9XءX1T(f6*¡jj&H[D3ymMt]cf \&\(^NUfrHSP* C"$7m:{T F M֗*3M4Ow~Uͪ׆~%ZH`17٧IZ/?Q[oOieYCdUߍ1֐-:6`9L./x,ʑҩ83a&4GD hP0tEW!&}@e r-"EeDKȓ)A.dlVrYȡ;{_malsw2܊f/wtc܊H,Bq^{ݓMn^J90m'Ć>[ G6MNdM@SN-u:֒ ɷ;-csĖJǖL끕B(|%CJ]aO&m#BںǨm×~g&crVV][B゙Kï"^{oC^E}ӫ~7s 쾯Xݟ~Owp7ȁ\>_Ysk_w{@YhWjb;0-,1-"^^,>~l<^Lpv'(Vq%t5Ҕb2DY%b=b >(7 =)wֿwm,y'me:.ǖrWjDlB:XRo^ U^M*`6*뤃>W#`-,6;gQ(Zg?BGP?jFv2pE_y!I#>5c?_m<׽?BF?W BaI_M 4LDC}k誁y*+WόJ,!azXXjQ[OPɳ(}eqFZmG9.A__޿u /b^/?ӟL\?܁fC\}S{Gd)TSf.F{$9H|qi${"/M38ZiCLŤ\{-W 61a .NB{='(˩)ğLE}=\m4-f}- ;C/)Yb֕*A}ߖ!-y*`Mಙfrm͐=9砟3$%/BړvkjzĐ-%좳٦s/ j j F>[حίʰr{B))~_9jL6SլzgWwn~g~oZqCL(#|EAeՔmu浔m aAed( K:F؊9 ;x\~5smFY%PM@p7OӼw)xi=/u;ҎAsUAW`œz`OdʼnctӐ3y̥}uD|T߲jFF(A`h!kPd,RXRV"D hsm:5.ϩI>,B&s?9}n.41fzu_8u'0~!1#~+xH_xxx: SAxxxZ :C0C:C:bV!@sA?uPuPuPuPuDG LH*Vj.kْ҇!@}TԛaPǎLk"tZk@b"X+pS&Nq7X}@K A-gHDVk8FYD_2,KrqeC ݕiR.jMС8g :@kU%Ft̽SMg7SL^qxGَ=.jqquHoOM?>K"j L #9#?ZZ|]yG] w35q;dG5OX=#v7/xmu(+k+6|Z&;p O;&|NAC6O0Mnht~w9'%ɿF+mAyXo=o4e;G$4,ݘJ7ఫv!@WwHR($=3ItxIvHI钒$$To(>$@Զ|!`>)KQ2sed 0PV8ٯբ_Ϧ ;7i62QWq~DvC#\ 
Qmڌ&h!N)BQPѯq$LL[LDR)5X~k bpE*V2fh"Cϴ8F] RN)HTD1DHѵʚš,H6DYf6&!4ذﶖM^͆vi8 :&h]Lҕ utVHD)gKY=%aAdjY(Dc"!BMgat0ΫT(D\I%ATF]RQ{ ZFh3K۪Π:@DU ~Рj ,K5m@(5H.4CZ1awxxJ,DLuBUt6ޱZTf.CXvNxbׯ }w͎<߸c{9?^~%Ӵ؟+!Yvw, Q;z%\KA\ y3>+^ =ג4rl#Zzckyv5ǫo~܃V6S["@,2V BNE~Ԣbn,*6b0jdkv=ό)pQz%=߈"FvjZ@d(p8^rcN!@0 ͊%KƒׁGc2=s7˛iƲyh>.n1hb:bu=?`­HY,鬩h笶Zz-bɧygEmmKPSY?R&iG7<ݬFs3vnlwf;!lxdçu_|;xkc候܍.X|n|γ} ߭ƽ=Z=؃]uܢfnv=c ?}*1nnOYwhݱFWbDsmn^ms\>QLd[<51Ӏqd]Q;5@2*3*lV3 Ռqy"(Xsơ)HK,IdɑqRZ:B+l8k$l0!!Gr3S*/ZK}W ft CK˗i0wߞ97༽vP+/>QE=ՎI~H"KŴCd->i|^Sm#F:rΎG &g)Q I"RRͥ c E~kQJ+dD&zlPW-[/IIt \q 14Kz3΁t~d".DCIHڀrXPhbɑ1[,zC3-Ho'B2J!)$' |^ͨӯ!h,DM9z@ i2"Ꚑ* mq1*Ț_y՟!Nz]sHJ42kR jHͺO )$G/cA˫g6xk|cIgy،v8AWtȺ8=(sUbF ׀V"!HV?,X䜋J{a-V]EM02{0lW2wC!Pg{&MaCeCDrV!l|0xb|aLld yF<6l<:\~D'բ fabV'1䕂Mc2E>Ŵ~mj{n+ף|Nګ E {Vbe @  @^Ϋۓ\Oy+z׸n'fs-6nRMaA0o<~@o :RT1±j )B+kXr[<Z !|~rZb(QH\ ^Fd&tb+h_o5uXkؼIӑ'D&O ]>69)kHY$Nٶ,>y䝭߈fB R {,=;^ i<tnx .9lɛF Tr*?&M|K%(tRuxM^ܔ'`Y$^;,KE~]gKJbbGBꗐ,ډJ%SPBߏB,g+ #`sQATz?jT1OկQuqۻ_h9kqj]{?_-;*LxV6 C2&ig~hhQ[w}H;Y}I|<0%kQtl TPN.щ5ZKbAgqvsX2 <;o6ߝ$eT $hml4m7|nq]>c!/+R۳yȔ'OEhH;}mvٻw ~5C& w-!x6V)N[%ˉStV;S`?Lh=z^]k/탟U̾2-0g̦'d8&߮*g!:VrwκQӹ]TI6O|?n6]sz6]d}WlkծzvE͘ ԅ痩N}yTq,|R?wMj'l:_=)}~O(?'B # ߟ</=biҼfiV{Xڊ>w>Cօ>ܱ~[cv% ?LvObUm}s [jM*%$of2?[.&i=E ՟B nJge!N}1- Jxn \o~>K|F+Ho6 Iņ *ACh˾VXS%h 7-Gx/k,QfT"c)Z 53aHEFIU 2KtV': 'ߧ\|}Jn7mOr_v/_k{ۏ+~ֿnGAkzۏrdNmfWok5Q.gW5SjFK7'9! PH7oB1~N3xңC ;)QdzNd[z^^2fCؤK9 #N\/Il9Q{.Ӷ|=Hj5h#vJ[ίf*&J\|qUf&͇>ۇI׫_?ϯ&XJ}q0]tm=?،|ݱ[&;~_TՂ0^jޚ[cЮPeO'mu-ױLʼ\< 's>iZl[fTW X" A6g2ȪȍMrN$ XMawm9juiuϺ.o=׳Tw;ERJҿ)g؉77엨CwW(Fg없 @M6C^!GVhR^\c&0)jL<&Ƙ져l4^vM Cc"q8}LK7 p{B!#z]QSPN`]ޛ,{@-9%@Ɠyyү8SK5|%cF'r1[%eAVXRJ f߱SD8??e *DH=:1GނK=joP駱Q%[t b'OI(3IBTuk]h挽3 Œ'B)z~/ e_^U͸wၝ7-5 lX~:ki0у&T%D:FY4)TUr6b-.PµlS D*ki1-/ <7%Q?zؽcycbkOڲ_-ȵG vg>qU"D BkR䜜1 RoniAcC%̚&˼&HFrɰHP Û fب&.8akb L1#Cr#GY5 PD%2{s5l Sl02jI! 
FyzevsU:j[N޺-uN-w"e>1xtKrIjp:Ɖ>߯h!ʾh:g墦)AHB', g&ģN9fr&9e 鸜|/:Íc~lbۭ^I0>uI}3eI1ㄦ<{eJZ2'΃VPrZx^J !*'P !z0b]KQ 4o][ %BZG 2Y1=J:n}rIkƙ""5"ф%,CmY[U,I#IPVGI;iIg]7@fw =$ ꣖ -j.yfӗR* f '.I[n#^iBJ DRD]CqAN{&"#:uZa1ڧ:W+ $9!81U1(R\2?HP8ٔb,'ƱWW{ăֵ;{x*n &P/uii9-1 ϸ%ZTgҜ}zD9z/ς5Qso]RVCQ_~Ja'u@BlL&Z~աs,u7hyϝ#N47@̪,eN*:7՟o,KorG7tp+rOp6]>`kNOΙ=\F=Z!ںJ\&/`tQAGȪTchEg:Wo,o jA>r<z. ] PXr jZ*]Hĩ|CHĠ)jaKU"Q,Oѯ%ꚛfr%R-H"Bڕ(hW10i-y BEI2XsE@@q)@ ;mj//=.;ưڠL9 PŒJB)Tt, =8R3$ 8 yW@8AYR%F 1'<Ƿ K!RMUG4$R+ PdT&sOhɐ`" w411H`RO \+㺖mv Wr<54:2wxd9,~SU3 yΚ> ѪHŸGqD|.І %t2DQY40dSrp甇7Ù]`dճo8xeS; VyUmpUڬopT񎏏2x,>lj ;rNᲘT#c{[CtpR2n$ͤBA`=zwD qM/6=G9qfGWjTUOō7Z.MZ~Y>X=xw8Y}bv߇4,w۫)ƓO癉#v㷂 + yJgM˰ej\:|B0T wpYxtYojp!檌l_~ɦU4׫s*:>Of!O}>b4eoܞTnT!N}9p|\Ȍ>wG׏G>|:O!w`A<_xji\^՞Kk[6ͺWnX6ʣVV0nE@.~:8~7\5粎T.hbk0+\]cj)*߄ A*Ŋ9_@|Y;y*ᱚ"V.q/>mm$B.$j,dRQ+&ZkPc@s8B6v6,o aϣQDq.ivLUNz$ nD͌k zWj_|7FUbDfcT8P]z- C%SnJR!&N3MTܰX%aetW^JgHu"Hc~[O*jkTT,JгS(+MDs³+f u>a}?&Wg'Tg^1vf9oi3[1 8RD D=ZmfN cJtY|hVLl)\:̜ v)57}1Z zTRK4Z΀TS*JmloaM$]Uj`W>@O7Nf@ ([+5B6RI\%-p@s2 +egբ&2\OlZ&<{ ,ޙ]q zFV )ݓsih@\O͜Bk)IמE!aqpx/4z/ /;JPj (T ulyf~.Ŝ6:Xmgq2!:&j1AR2egΑіضAZplCǜ,АՋ)Ug8;W2٢hYa+R߫o.Jq*qG4G=NRp ɂ.2ٻ@C(%<#!9pYcb`$ܸZBMz=A~&x0i )f~*:_\5XQV"~q;k"jr5,Ka/s@\A;ݎGhQ}5:i?#G)-jU|;nF!&'h&h&L 9NM4gC1R\MB25WFZPodV!5m!djv okq/;>e~}ƘhBUgrur&v>F[s13^=^tF`Bw6t6M^҇`X>VqR"d"/U.[4eT%0gBŨsq9,X 7D4{Ӣ vJ9nRJ?wy8?r: HX#lp*Z Cp jdvcl/N&QD_a=>}d˗#q Ʒ{[  t@`c0TЖJ 9r1_$g>'>9 pF.@;G\Nql $dd}qU# 19aM+DLbp*Ҕ[@g6G) ?He׬B7-e͢t/ \{93QvFZ'BZ4CO׷}-U\s7x) ꉬtR)&Pɿe(w ֆ˛G-X/cAh@12*>ڊ--Z1m:KpДزwk ژ1ZtyNMq#t6f R.{exK/4$m722ogoCRKy]&0=,+"8CA >r=EM?mu28Eu9J*U@\K$<39.O;]mi;MΦQ'(Z@%p\T \NM;7!(P;5]sv^13S1 vF5Ϧ&Y̕5\j^ܭnI5ld̞!^T}ʵYmVT#3&haI8sp;̽[YB;kNhNKv> kc L亴Ph<5toŔ^B =dN'c"Go53g?Sxs5FVs69Z'H_ޣs~޼"_ߛ.˿yӄc=2]{}ZEʓx>爲z~ 8MP\Icp2ij9ٹL] yVa: ĵarZ0!cֱ9!pdR]X$ %$b$FJ~0oG_.VI[ڋs.Ce!MMӤ k:dѐ]Q 9qdE4u--{=&&ZOY!I6#DN$=*9ˊ[fz"j=:7xr|Ϳ\ U(* kv$+d0vz<^&- Al{LmY.PFK1R fl؄WQ._դeFHK5nY{]%p:L.FR+&LBak3:'u9Q:Z\ ;*77 ݒ WeU&mIwfE?_at&s+u$o}3CYu TZ !g]hYZMH^a)EmO#T|0]Fpf΁a!Ȕf3E*TtMV]q ȞJ\̒|&:P P|l׬M| 
60bjsʆ[x:1&Tf`bES{'д|Z j495zd EYmbnf5dXrKN#amcc75ڌN1>p@n nC[=2#4Od 8}Ӷ€>"z|%Q, l' '40laX>s69/h\+ĝAɓM91.Mg8RIY^19*+WB#11O7׆s2O]G gɟ(zʛ;~g(ˉrooۈ>>ZZMȬtY*GYb[=v9O-Z̺BBbq"CJ0E$z(;Z٨NN\1w.\U]~QPwWʏB)]g2w"2wß?}nTE/.Lu?ZӕNPW8sx;_{>d_]öş'^HDe[CN=4%`b=~:=y-fɔWxOwXZ`n37݁gOa8F/7\NkZ:FY4d%0V nh}7]]us]U>c3rW`˳qW\⮺aFW޹\ At[^fE`lPHN6)-V7I]N~>Uuԣ1F]iT֨tNdt%Sѕ~J25֞\i*TX]3NLj,3$ gUۿ~jv}yihtr6W aP\Rѕкb&]PWhLFWoҕкO`&]]E( =L/ {qy~~y!A&p,7I`7E5w0M.IrEfST [̖m-nNxz-3*޾Ϛa"_[XeP; * *]a?|w~q/vn8K º'Sijr|o O2Gh,\*Ot\V7+)VGȳR&W'Ak*|vz*oG3Mz~_q^T_y*\epʜVv{A1 N!x.n=_VuΩmbzWξ LŬ>F%sݸo/E}xw*GK=˗_zܜOm~EkH]~9thm17v-z0Zm1-Zb ᏶\9WM]7 ΨP!M a*]ʝf1n\Uٲ 2orFپkܭi]mru~OYAaZ◳NJ#뵖kB25W1XnOub]H-WY*~+ShX}[wܯZo9n{)'Es|WVeR4ٖ{l.y XDTPv{}uV =w,-7߫+$B\"Q+m-_W3Cʋm:1^v^.syfۓ|z{Z w}Lǰ&Kg$ =S|Am F1vmG,|#: hκ-Rݾ4V%Fww#=q7N`Va{qL72)GF,qӷp9"q+x=7C;=r.V=wn^;WY6vg,SO&]{nvWw"r$VY [A)JfnNen>k Jg#bBb`W'+}h_WBau'+vG?qJh ]WLay9J]`侀M!0C0nj =Dsp{z$T5>e~i=Vmj9u\ ,r:yѿ;kvuVѻ[^)Sd&WwoeMw}6hm947J춑 M ן>-$朇mWu螫Pj1 s(ؼ lh_Vʳ3oVk:MټWS}yx!tzyȴЛjB~jX~rXp ~[]=~~`ǎv-zstBdt%֦+f躂`6j|W 銁M2\mRѕrh&]=ts93E3ąR 9@MW iZ}:f\RvJ7uǨiHazštŸ(] V24ƨ+:p`+!]V]WB S'xrt61;d:Bk+4jIi슁{ ]WU*Z37B&]QW,&5.&+M3ȴ'+iUXa 8}`?`^։] v-z@tA'+.$+ >]1m@3t] %IW#ԕg ) l ] J(LV }Bb`W'Ӻcq؏2IW#)c623P "U {o#A*Jk4Cz[[}}tC 7XZ%,Lǡ$xOBBq3L'&yQR6fRz=O~> &HPihaC F; B2\RӒ{f44j SzѕZ7t] F+Ц+֔NgPp+iJ(4xLBb`C&] Kfi- ~J(t5B]QۄtڤFBp-]1eߤuE[ѓ+c}q/`ȺkI{QVgTw]-tKуUXuV^WBi¤ -=>sW ] iIJ(N!!]11.] 3} ]1U8t] %¤+S 'F>CZue@nFhZpu>^Z C״PZ?iz&%)1] -J(aS O(jHFWBK ՓF+ LHW lNFW ] mCSځx0yt}3MFWZ+ 1*lv1|]1p@טTt%abD^c\A?'_}#9~CIW=*eWO`*HEWBaJc']PWZ(!] 
JpKEWLk]WBIƨ+ xz,Δ[ FX}Є4!M cI5-AM !Z/1(SNV <,?ɓJַJ6yz2*=3#zU(0c$iS5~=b3t^.TBY([.x(h:7TtŴ'+l͌t5]9TΘt^Yp!Bu%;n*0M.{kѕz@]WB+X83X+}ґHuI(a`IW=*K%+6DJp-+ jb,IWצR{4JpJEWBibʠ줫j f*,(, ̵Mmk$z(}̮nT4ʹߪʡ;iY4m 銁Kg̎qI% Zk+ zueQ?yTW @%+OEWL~+@t,r銁я 2-L1k.֕ZMFWk@+o]1דƨ}{u|2b`\* >i(YXƨ+ (%]LNGWF)Lf]hO+z}ueY:dvRfӳ3I1hI9i&+Wxp?UPȟcYuAkNu@-/[sDVWe;ȿëv ?]tß^o첈1?$W\\_]]dze}:XKIX_K^*;k䶟˯|cowM^Ŧu}wϭū'f-25C񮤚/WP?ъWQr@y**`|v|]{-[a a7sy5#yC!ޯ!N9mG^piwLk5ě*+/#Uq>RXG( P2P179h4kK De4 JVqt_.`s_iUg!BkKu4.yE|a T;UXzm 5'Ҁڜٻ8-ʒ\UBqcP:ۚV9 Uei[i)xPUU*PBU\J:/ . AS(A%:A`rr[i9kAS@..?\pj[,յ1E6 dFι%e*GT@RI-˪C,=PZ3chۘ)lyu]ܢ# UPyʺY=h`@?חӾrKt:%8Nemi')M9Pb(~$gϋ *cYE]$g9j ,G!H+ֈ *Ў>?WWgN0SZ:9[2X [iyd8W7"3:Wȕ~F'_G.Pqm஥A!a W)!YԨJMۗ Zʖm^:nA'Z/('U<$Tr1XkW?1j6frg9VXM΀,(Q6pveŵKCv`T2rܴ z7MyHXr9-xv]"B-:m>p*nks͡mTsj{;hxnR* 8Vm)Y7,N,55qñlJ[`Al8kʐ#o .]cl/3j[0MjDʉw6)MٔiP,>OU::2c]Q}EK05n0Э#)7fa}΂nލ9TUAPGU((|hNaRT) 1'[z[ ҀRS̨l`WP&4$ `AV$*(k@oB& 𯒡2UWH>Ʋd2*zbJe5͐jPoB+"X2nPƊ7|k(<&X !@YP6{it1(QdF*nYy$^w n!6K17)_JJ ufM0*Qkq(y #3K!\+B4ަ2ݙPHq Ls$e丽`r ڳFxwD"=dPIWm (ĩ(Hv\F*^eFTGU]%їT}FӘyիf !ѿ&伖H 안JcMȲZPҰ*4]AՊX>sA ԙy?/t^̈KQED$' GcE( / Uy2^UG˚] S4׮z|ou_ L mZI|txḰKy@xc#Tq:Ҫd'eIW}HV*U2P (yCs ~?XQ|E ) >@E&rZ5ȼ`|ڄLktq?X1FZi(^Bd`Ye>ZbM=A!%DB>hAjyq:EjC$*KtPK|̡P}ֽQ@R@"Pc*2;"1y:_Kɢ+hdIٌvtdݤG&Emv 0 tۍtcL'r\Nϥg?r!mR,HrpoEvVK~ڶy۶I qh|50~l $}']EYs9;&' 6 QG;hN N }&@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N?Hփf;)NCF@xJV84;>hN v}+'kA?"'Wq:] @b'; N v@b'; N v@b'; N v@b'; N v@b'; N vT@9'F d 8W8֪wzN @b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; x@{cBe'x@T8wc'Уt`b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v8>Z'֫_'O~VSjnl V}oBq S1.(b\ZZKW.ƍhJ;"V:]J/!]im]s+Mz)OVLWP~;?6W) uNn &;‚N~ةՇ:}BSVO_~bhEo08N|GJgJcS;߾6*1&& m8yU N|4tu ,a,ӼC ^.hw@zZ8& ,T|'K7"/*Ybw|%,LŇ I3bf<5ilszEʾ/ 1Fpzrʒ3FhQJ*%<_wۑ]z>ZΤ躘j(R_2J+yddd?'nS%S3θPr,ɩKWnJlvv<%6n,%6zN&KXb;eV p϶ULho!;+BtzDtUz<י  _NW@)fzt.bDtZ5tLh]q|?CbX0nmOñnm{H[LM*F9[A^F됝$NXL).RBf{^pKDWm4Ѻޜ/eƜH/eRMcT$ddҖXdh2+mkdlELr*Ѫ`vZp(.h\5Nxlxyᝮ <#nb]ޞl|Kd|[}]nCލ³z=lTN{k{oofGDY.+ j<?K&]@w,_Itotç h:g XZ^i6b6Qlf05gdTg҅3ܙS?;w|gs˗[F 
:C}Vxm|`n'Zณ3jr(V)i6#)[/W5g7o8$#$ ::b:y^VKTLGN7 ){)tl{7&;nE4[%l߶ $Z8t rcԣ֫"t,Tɩ2z /Sl:Ug[wm[!o6a:ǧ0ǻJ&3T 4~zR[uuEkrmGm$N%Z PMq>@?{HdR\ A./'Yڑ%,b[e[-vn=Ϩ)v5}bPm|X{Xo׫ _ش|2puK1FZ0v|EM Coz= Jg\$oBeDFC90ǻ=;Wk9e=tr0{Yڑ¼Dab6sfvcq0,F[AsFiP8 J$d"9_)z 9TfotTa7$e2.nķ Z;fD0ͭt69 }Swr/o ^0Lo;gۯ+]e=m= LJpt/ >Vu=x -#jv;7}?R^},F9},S㾉W7a֞'dqU'Ui\3A' U# mQl97zQҋ#t@d!J@] T%'kS#tk#kʢ<+Zڢ$%#b**F,`Q=&C/4$$KB|ެX3r6FŻ[Y?J$l ek'SSq}lu{d>Aj~,~$e` Nh<$A3iRyy -Y#vgr@<}!Q] fSo)q(Zg{A졛{zH0)9۪ц[rL#V5: ']}1B7etSLsK =|xW}d95'F(jPЌs#(t9>&iP4hw$ bB~,+# +|\x6VȮcr]]ܴYynM߻TJV+q"PJ{ɤ7]2~-8yiJz)AI/d, 9LبJg Mg};^WA]{5} 9E(YtF q{/|ءLi2{g.l4h"X)#|I} 281hMP,o d]K8#QAQɠRF[x+bV5MdKe&(&Q(@eH}{0b,doɚf׻A_h3.׳::;:Ǎ/_׭?&5`$h2Rd sxcOQ)Wya@8Od(2fVw4^w>^ӴwvaVJ={o|;77^K?9J(;iNV,/8NgcF'-O&q%kE*L4jpCՀaW6qc&U (I$Țhe+ F(9kg|L#I~]6=v~$JףVR$Dy.r_BI)E\i`/4uM.5Ƽ;nH~u̪upWԁ9KKX-#w}jzN ~NeqRJk0y]lr, K@XCmw|mQ4ڳSG'D# ԁm CS]5at) ;rFٛ}& WIP3|RRPRD^9od,3RWmpk/Sd"DaddQMd9}sTob>[v0f*$v[=+iu}ɓqioq1] KNڢvހWd4.@H&ۮZh!l4$SyA4Ym@PX,z) @%hhȩ66 k#{ [$s mՅ=qV/pO\ }]Pic(WJQ < 8J|," 'AJk]cT$&O C6XyJ@6e36 mzmŘc,eXe6LmhŤˁfzŒ K yE@LĶ!=Z$!!$ZS) 9cFL& ڞ 7qj C MgRP\QXѣ‹tV!hVf"!eJXUPr cxËb /τ&E 0 x&UX ]$ϐE:Z/_/𯞶0o =O7sv* r>}+j׻\Gu8C`'<3(0gq~:Zܩ1U_ 01M$ Mh4̅v5J("cdFʬYKTD+9o,aP,Q?dkOO!Kܰ_ykg#leN;Ӄ]OpS~Y^^ȽR':B-/M͏fz鮯{? 
ڐKC`bcABQ5da,YKXJ%@S:X2sqC1(=(+EIOVh"* }"kB~s ny$UPdX<cEw|7qOPFvOpPM86qfWQ#:ݘՏ]6U؃q 쁮׳]Wm策Vosu3Y9nnWt.DiX惧޳u4=vbʚMߵyMj~-ӱdJ nC{vsps'5TMAH^ Ti Jr&XDq_p;Ec]}Iq*InP8HY8318X\;;f~;g'Wgf'Z[gޔt'lh4?'8V\J&h HE4F5YH)K,xBNDǺʧNJ $úb}\\m2EVgJ -C&e_bɈOӤLT̗Wyt=g<{gÔ)SLfmَ HFZPf#Ic p<}X 0ze{ | ro9޽[6bQPkepB+.3źȈ)9/H%{:ѷzZtP!fvxXMFBpՀ׍wi= ucmXcRvދĞZ$ YC>ôэƟSt2<2 н|fqCҥ6]znTs@|ylEݦͭ/wu`kZbmu8l7+}8zQ (rl}:ƛlͼB+#I^eX]|G_KPX-H[no_n&WT]X5N=/n;n`\nnxb7:GlO7'W7jބYi48r⦋'N!-_#1FmDƿrŃL){I"z9cb -fcQh!% r11KgWe dò'p<îLlN$\$Ch6i*׸&h@6&ؠKEɞ^T4+JtYotzyTW2R[QE,:m0V’HQQ@jBG%aZ}N1yx~#u й?c|:)>@Ǫz IE t "km #l#Ip2R?_CVq%E`B?eÊ}gHLG5aGWwW}U]]eI/In6&x RAiOMhzc)^$cR*CX ˍK63Kq3##G!U%` {h@K"GMR*@̖k0jmIG%0- PQOE~FW CF+Gr|$c;K7>/ДkZh*yͮDU1*^q:78<G:k|6PXj}C.> F;i{x_C'ˏ'_q-P,tr췹 \^)"Q"MQOG'7fTQ ԻVwyi ?VKTWT/8[7PK`rs3k#h^*'~_6՗ ?OFϾ{:o_=Lo#0&$ٛ_# ^=iUSMc{˦٢iEbiv"`nJVߎwi([ՎG}p3#d+(aHQ_K_T;(VTPAft#j%7utۖGϛ\pI(}Rh!@LB!Ƙ hd%3׾&+@hy+ 1|1'Rz#p1|%pTZX/vOUaW·tYՇ$lGYa;IApqZ˩*?Q@E/Ly>ȋyDeJĔN8S&&rxC}=889l7[<eȖQX:xz\ݥrWb>4ڂNX6?g.()t.m?=ǣ|9%(atD+ȐXC#GЂ4ă~#uSher< W3V1Ww&>5 ],6T\bey:!9.O W؇UFF&[UGUM: ;dvm.Y'}ZybmQFo 3_/o]o&f\XvL5|oԣSoQ<y&KIUǦG|˫\獝'\x9\GpC[}E-y]Z#Q2P.*$HYU EYYn2v_e֖7 "" o"/J'[xxs5Zy5,,v',26,_ŢQJOU*]6բFd|f65b?uyjX$サb*qr\ 7dE o/<ӧi <0#?=YjGe i@P&%w=J$F (̊hX;_czgwhkM'6DBmN0˻] F8Bdi;{:ogwl6a\55>J{@ \6r>'҆ZaUS69+р Ew\iY2HB)eАtx82Ǒ)G-:,:_" e\ B8a OD tt S @snn6(Ebhi.3p]9CsZ|HF%S69kEO感 lj4FΖOBv‰nv>.^F2RJ .M:%\P(jj*rA-8PLlDOP"i 2O}BXI'.h߰6F*8ɩ{U 㓧-3-)[6M+uCq]y}.<9[7t‹V\TT.Y{Cןg娼EI9h-^(8rh"E]lʀ5:]Z4YT pg#9,g6T^3!DD021rGl75-w<iMIڽESDFf!ց"0K87f,5@-R"Qfk%!N !eHhϣB x"^s7$G:*tZy+>^ BDl?ڂGLg$ aN0Ps>q [0ZA2Y hJTqI2UReTGOVQI>9#SBڴ<yøz\q#,HjOV(۹* L:eD(&hCRƴzρ9w<a<3@*~C},^V:rҭom uEE\xG(z: ֦3ߐ\JdWοejhLϿ}=S/ jcp&{8ډZaj'*+\:fD 6+@3pPm+ p0=!B U&WUְzQp#p;W\]LsspԲW@ԯgB+nE-X\%9-ֶmUyzX +))~b2շ!N3 )njΤm`@gMwPTv3Ͻ?Jg*e/L*À*8S[Ð\ Ջ+ f=l Vt\CXWJ Bd+prJ/CpwiC4+;WB{zpRuДwDrR+(o;\!RkW/PF`Fig*:npT}Wj˩WDQCv!rݨ}npWQ ]pv+S*(\e5t\}ݨ=nTdW_ҚtɺnNWJF{zp5cv2ډ\EpL"mL!=\@"62S&N9\[@;QEoPo> ݋7cc3.|AI'Q .p:#ߙM__(0.(ǐZd*pɃ0E /v{kgӑts6*{zN{)?E% 
,p-+Ÿ=+?߽:N2sW546cl }XѵaEMeo/l/ESpT^@}xSAzl7)+>)BTF OL3eb"eNEr!B=_/U)z9W4 'Oik0$Y!^G4PMθ^Ʉyyqs_ JPbd%pJ^kfE4 k 'yIFCVv%hKޕ/N5lby۵y[?UlOauj0{_MǿXo),)1-is$׿j_WZK7˛w`b\dDAI Dʔ4~&M.0:Zf+Z\ڙLm 3 Ux\e% \er \eji;\ CZ/Wp&kj:WHN4/n9¯prEH@WG̕/_ף05%xO.wF-uaB"٬f/O~@q *WW+D7perS^WP+Me8wF\eG ȭ+Sqe*W*=r $tq\m-y\m.mjㅮwߦroONp\=uɥlŋEn^6\]!l+ſkmvЦ:Fv]J mt w)ْOZf-q?̒ym_AZi%>FWYZ0a+R±pK ".+\W&7-+n+ҧWWLp^]F~\Zv{Ǖ5J Xy!q1 *WWNJkWIR\WXV׮w\]!rLp*+Sw_ B%;wꏂ+qzq/|3WFkWeGQ텶ڨRGqݱ2\V]Wo(*2+S+ĕOq!\Ap~\An'!+ W!or}>\Ap$^W~bjӅMeՁo+v2T|K^J~!R^Oށً0 ѹe0mrɯiSv*M%Ӂ+Ĵ䘈• V]W&7Upqe*wWW1gW:A9`rs^WP˿ _+'W'i!\A09ZW&h\ڰ{we*5*qǸLn jv Wȡ`.Ft`K$w߮KڢBnRIngp\с=Vg^Wմ Lm{Tꁫĕ'?b`䆼 Lm{Ǖr q<[W&22Vw+S܁+\-o/՗^K.7?uyv{J([[ٕeSs~aacOn_Ny[;Z< ָ LnUpDn2\]!4v+]g_W+n̦R7oS+U 1.1-+*LǕʞ>VZ2LnZ]evh2@z±| Z6 K'Qn=ԊT>|p\=u艅_W&7Upej%Wro&BLpep .;LMqkv şMMnUpqe*9B\G݁ .'|z ѐ4'e`Syʇ7tLcIWr9/+u\ڼ{\Aez qc`Z錨 a\A[WvkvR܁+ĕmqL5ޯ+S˴w\JWIJkW&X)Mn^WPiLOWY8-+eΈBmpn2pW\G}xTCfK$݅ڷ ]ͿMޮ7 p\=u)ms\W]j;LeWW+J 7x0ٹper+Sqe*B\q!\Ap J^W6]Av3=ݵYHݾn6߿#0>U ^[qa>:@7Op <-j'ͧscf/ /B0^#ǟۆ6o? zC_<4S+!xIsPL+Ś9WE1J`*%O뇏<~!WjߍxŻ(Lwt m.'rN|xN:FK5QcB yG3zIav*JxR-6o ~:Ks^}:9 H& NJrA\s3gIV@>K_У`2cSZPW8:&nHyZ5k_! Mf=h2uE@\sT;) Զs}@4pxub@&Am`1$>gx߻4GJ(+$|0"e~"_w9>)m}]T&--ֹ"+(kjɃ'yR٬jnʘ*BH%Phrr3F@nf;jI+$|16='[&ݒDBe.idi%qu$~_c懂͕SjsGisu6P )E!VKFS AFKVv u>)"Z ]גٻpZmG̠=7rj Ou.耳,9)#;:D{|j.D `(eohZpJlAse- agpQyрT,2hK@@m2c/m̭BxkYJs\ZlgG^[Fj@m\yR]U1 {:JsEh%>{V\f (;O&iSzpa \="*C7(cȆ e@'ic‘Lb+(%Bŵ' NV2K=tXPc K!@p^GFpB]9aJaPwjG1d0s 0ȂȄiLqҼkTSjWԭz`AOŜGY' sǥ<`a.bֲ%8TR 3=jANy 6B_`[betڛ{j(SѝUD.0͑NEUPP>kϲS;#JPO/ud@!N U䶺XեhY]LI14ZŶ%,`ՑjvBrI(PSl-@ >f@ILlu:a}jE}Lch"hi#:3E0hlVGXl+f3S糊QPa:2_~Ny O?wmYyyʞ] 7z|0mLMo3S00Q`Ƈݣ.S ~t*YR\g*@kݪ ,t<)"E=-VU,J芸Q^ 4x$QLȼa|zc.ˊ1:/lgxI$ZW}eMTBHpY#6X܉m`uωEUguSY߻%?2bN`v1A;~Z0__nwonnf%6\MQb+|mHi z<%iYN^Aiuj KsM)u ,,5(Xt,!e--hЊs;y N0BI^vOe3m)Y=$9j AK:+ j<30آ |d!l=i6sVH䚳[. 
u'(o"*";j=}F,*jR4]q(+!*BPݖ-Mִ$+0=>#l=iST< Z1MSP;O;jFa k%.d@@F@>;>-@%CCGKm-%bb|nn_=,7oOӾqhI ..n2Ň 4 Ņ6"la[(ViV@\kIly!Y j313HyySNdj3,>wJ <%2qr%׎lz3t8:)Z2\%0UdrE@σ'<)kmֲE0{aI;@>mpEOGcYHrElsHlٖ:ՎxxŽI. z5VuT OQ0pD4)3\*LB9pYo $o`$5%7 m!Jql'Z BOac=AcY@" 65Zj>pU(.Blc0:jfx!D$8D!`1ykt+,;6i)t0pU7Bj/nLVfo$2XAgA0)#j-Wޭ=z. k*HaBm/'K7.En0f >ۤhTwtrEG/dx>ߞ.f]Yq>x;!qsJᇝד2j '7o՜zvtXn?|TԝlllY?6hel:=m5!O-'f9;T~3vVUG p0l^ (E%1*,@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P JUY ]OJ j֟ b+\џ;>|% @GQJ T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*d,֯i[ҿVwRIf9^+oSvK$nq}T gׯɼM \s" q("ʜr` `mjVORnmPu/ (@ +*JQu ϛïnQ˴ȉS|llzR1z3NPQXxۓru38[Wy>IkL.҆J .z1 Y>K2,Y%OGerɯT.deZ|DZk&V)K \K}[ 27c BbQ@[}GG-qi^ fk .vuD'eC}oO fFL5HB aDeԜYl]sNY*Yt$3Yc)FՒNlVG+o?".%X Z5si"(IܫaFv 7'e•q!D ;MOTQ/)r).-keG>Q}TjUG>Q}TjUG>Q}TjUG>Q}TjUG>Q}TjUG>Q}Tj:d}R`^1:R3T0J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%(lȣ[ upn\R7nC/14kf(D:BrZi)8oU~;W#̂[sO]g9\X?ՃaH2dw促#/YPcgLrf4DZ#1 _D9,ֻRQJ:y1K׿)޹Af|kF-aX넵:-T(}>`j&r'NPiN;ɭ"q$4g2vX˞L6hSmIIx:Yf+OjJFk zM s0A~COW~(;Bk$|f(xG&쯧[Xi?Zw}׺`O[t'3kVi;.zUYhǖ'6<0ht`)^z*l;tf/CWۡ4JnAWML#mu/th%UNWHWGHWXl67tUZwUQ2&{DWU{]U:]U!]!]IFZ1~;iցuZEgʤD6{>x&jVm n.{8k.ɼM6G8g7\fBHV \h[zCWՊD< TiiZY#X*^nי1ҕn]`Xo•*Z}ApEiUsRzH#ϊh+e_誢59S!]!]Սau} fEU?wUQ:Fr>YJUoe*ZA*J8g>> \?U3,DFHPwN$NzY6Pܡ99eM\ldb1褙)h"(IܫjG3vqFU(hZ^u]׍Dݞ:}Q6hq~1/T~_N*;xx_]_y3L˻'f9o- 't@!U6wOW69r7O+B-v&UB/|;I4Mڷ}2)I/Wox0+݌[2;1?/ܺ"v4JRj|Y7?6z.[5Ndr~vV| %g^uN,vdM1xt\tsx}t,x2 :2e84I(c:9I`ç`Jpjvi_m7.~NHGv>uZ.,ig6ތ0>ƨ~nm2lonNx^7[?糡kYO! IMI;Lg_׿%$v~zNus:I[;l??^qbw_U zH  d~*_gqq{r ϻE/F^d[xG4;ZCFWÇ=?uMB"CN6f ݟ{ܟIK`|3pX5%{eph(RDQ1C,\3c\e ka۫ghr9kUޗSL}I/4%g %1pk1m{ qϰ Zn>$>)ϧד&?($ ~zKaz9-Z\? 
.Yov;+R yVfɺz J AhI\=uNK6@55iS?=TxH4{ax6 !%֧gPy̻'伖a{N&vs~C~'=ɸnXpY+6EIzM|͸U]^Wzzv9~ʴ=e\2|Uj}^?LnLh@~;II{I^нcGnPln]j_yӿ>FyJ 6|NmsE@U%tPKPJ9vr9>uh-> ilͳ: u.)t)\LDQD^:zB\Y6E=lGZ\H0<|Ҿ9Z'HbbE@Ԋ'|N9ǃ8w4ho`Bo0Wd ]Q.~o~+Wˬ2hKR[iKLaAJ뵰$ ePJǨhf}\93i@I2xmBe2W΀3q$Gs_'NLuA#)Fs\`).Ys£)dR (3@ >?SلY(|@C]*.,*c^"G 礵>!'j5{(GNČN=ځ Ucnm*I{ ])zq,&Y%"ł}4ɇQȜE&cQX'Ș`kf&I 67dS' !Xt)᳤ 2tNR]Yj Ab*(+^; ^FD V睰j ")dmTѤP7 ζP&ORFZgB:hĽ΁[m5$'bʉS͋eYDg99s ;1 LO%^_Zq<x)wID*YaYkm$GJc>傕 H]Cv0h+#KZ OYvmx؝)D\|P7n >j\.;a h,.aq1M/ {;'Ӎf*l>M{ 9l3sǎg95%ZE-zfrW`h3f*[\(-!!ٛ^|%('ٍ~=zQu3YWWFFpZW+c}h#7]nUv&u_[t_!:2DEp6()Q)FXB=hb!fYFwCfg]v;ijMC5g5V70.7椪~'>Ҷ٧fb"eޞ>=`~65٥]-;'Gm=>ش|Q?kϾҾ'f8?OZBf~>Hgcw;U겞ݤ.v2}%˔+TrX,pކ%&Cw^kꎖkoiu{|tjoөv;JT*ɸGNjw&kJG!7ET\5bVbDWu?+<3T~ڑ\[΅1*: ɉ)pr M8aH'*T/UeNʘ8e=L_,:(i*(%Xى*尰gB;`QNU֌z1+l}[u'`tR/8b!4/ODx+8Hp 4*#5TI4dcE#{YMtP+5Q'툱 ª Ma.Fӈ糸cbqǡMˢzP{`[8^)9! q^h" ss+("`h)%(R")(^YJ-d @+:pȥ4"0͹~|L2F5h a1rƨяI }AbD"4zhtQDvnctD',ڳNqIrWVC1HMJS`)LAE5ZLh4Eti1r:Ě.G\7DtYKE^ـ.>E s [4e:JIdkKr2V3%D8ʘ1Gp<,(0<<U\C9 ޏh׳ccf|}KHMA4b-!Z(H4T|pzxM69q#N@Ay'OAF%xd2RH()Q2%b,jp׾<*% i!ĩ\BurT*\HT ƹDla9<ޅbm)N|ynP#?^.7o43Ɲx1 =;`ӯaqK#x@34s"I 19*64u1!ttAl BrD8 "d^^q3QS@Yb%a0!Y?u!SԚ@@#E*(4$id$aڨ5g3%CT{*YB\`9OG? WKz]&;Zrdw\A}=#/5nyhPq1:€%)Jw4I4X2D5n  I)<< =5əFڄ:4CS(ʑF u(9̀M5 %,NW,Ԓկ0k*45=i"J5UqjYMۺu#ԗ.!Ơ]0#CK&Uhщ?5 B~y[Ӂjy,&vHvenWCM/"-h|: {}.SMI۴uk3mcι1w@ǹY//_mn#k-$)g>ǁ xP x$){A Ȼi4Ob#9޸nn{9~l.΋hoZ_OG" $=iS-oǜj6˰{LL*IXP6CwZ?;6Oyt  L:l2AVOΰM&>PĬbY=u 16])pJs*P:ed1r:@ O%9\]9eL$pOy:^;_ũiѵ y!ߖu2tIՓ$˙pzw0@$@Hyt6ցڑArInbmI g`V1R*)"P7!/dNfL֦dtčZ%>DLقI$5LYBKi’`-*$@+N Y1r:Y#Wlsf/ KD+Lp#0B}2QE%,URJe$,! "PR N2;GҀ)9DoNJr>|4AYP:.xDc|mU+,=ba;u~_! 
(A D*f!Hps)[q X,pjOtJ^m{ ܩ#tLQT?-M9ߠ%Dk 'ջ_ M*p,p»V%w9Wzp*r_b}uIIы5T~?ZAK&K˨Y-F2E8Rք{5;rGxwfyڹ?hYK<Cnzסj| g[Z֑avZG6~ժMmSd3f6OUij1W{TjspWa-q6>^.׈ˋjӻяxfRXZ_fܲ a0BrE;;o_"ZzלمDw?IK ZRzָ^%EK$b2F"'Npol89"0i5>jKS$.W hڧ!-rGwγu (Y 6;pev EyO!=([1(Ljvmj|iLkS0-NddDCAe-aS 02`17c·L9 hfTHc03ܕ8CS B_YKmM] 8wJq/6:N< ΍ٿMX jr>!A !Wv@QeHic<1S&C%Phcb<|E+3@Βsl6pZ+,V?ۃ>;*!`|Mod9ӯM4\OPSiOrmBVj2Nć miԇu:JP@> .%99ӊsÉkۓzxqS;Z?m'RlJy*/u qOJ-m>6q`\%TsEiǟයt 4-e{m_CxdH I.pMy5پ s>`$Cu֭U|6GT|5~ٝÑj6*{dNa^2au$!m>Rʯ(fpWI]Ϧ?9,{fGݼdۨmj Q Kd?!w}UC5OU1=|?w*T!O?{WFd`,0)+0zfuàai-QjR>7xIdQ62#ˌx_v>$_p%w?~Oo|w?~iM (\a[?Kot lpi˺亪[^qDnKflm'(՜JKĈ *zOGNjI6A=*\6@fyd5:hy2DRhK<#ݺ5gΑLJ&d낒FA%f:jQyo!"b4(=4 Tr}o6LS^f U+>Kl4[|Ty IIT҅3Zb&Xlw>.]^ 9~u?9O2&҂(DyUy@WYg=6zd̡֥/ׁk ʔtadhuV&wO%!(a*eғ-J#sF Df~gglXy0.4ë/_drwWmoo۬@gZ}!6{6r~01zr(XR"N>E33@#ҵ2/uR63;ȹAt%$Vt4r`]Ou3x)zxxuuW;v5qh^Q偘% w%t,T܌+_5n]( КɾQ m"CɸJ!wC!97wZ2GZ22G\Pؑ[b>"9:U!!m| DƺqF48N.r0p|Dּ::Q9'fr`)*2[¹Nr\ 3S<1^e](/꣐}X!H00vܞLALūY)xA_>ܔQ8P[#RCl d!;ǂnzTakU=8n]p|Dw5fnt˧ɝ-VLTooG|Ok:=7[ۦ;Y^k;ufrtqy2ۃ.qD^*%&TDa:pPDeJSJ=q<p=gڀL֟[-s5RQ$#2d 3GcP8C2P0 ̐4vgܽЊ@leHDO7BV]Jo)oCӠ!. ueoG셁4ǐu1l$|dw.PJ D #3RN~|~;jTrRp&,+lm̾t̾tt\J0:K)l69gr`5`lάfɦ 2qO{"gi frpWŊ6)`emph՜hD)m>âM$ʏ)]w]a*8tE"}OW%Jnjz2lK&A eæE0AQʾĘ+Sգ_VoUk|.I5ɵʰ/('g?zxӻM5ӧwݔ$o`$L 4jz,r;/a/7$℉z7)dyɿ_^Oٿ&8I2됄Q í,8&鯔 Dߓ6@V*]ۯ׫C{e_4%W=?u f`oiw㭟6A33gܶSE+|V2kBD Q:*JPBuT Q:*j~9jʒozmd||]4okGeb[+կVWZJ_i+~կ/2VJ{+~կVWZJ_i+~կ7s7~q6OW|ތKf<^6nSb gF \WׅSttx! fhDc2,3J$eAYC21MR2FBx,0+mC#NGY!S`z"v) ޵}W܏R<|K4\-muaH'_np3NJ/(ݴSyݓ_ r ʀ%sjW mIƅ T29y)B,nAj D0%^xTޚ0(R \"YTϕp[{ҧbLax,LHECh*"X}1)eFo, I9ݧQM`.>7B_Pw |]R-#Sd f>: ԯ(=ځьB]4Dޘ`Q9Zmrwp0"d IsZ:!{ä1:z4dž)͵1מ\͠x@Ql>nG}/t hR&۬o3Z=]T|T,b%a CT6Pg*A1 (&LdӀK*mukY6?ldDGM&JR{Sʲ0a6I ~9QM"t:A^Rep3Vt(1chrv#D>xtC'H_&of2ۈst9H }, 1Edd=7 tF:MD= 0N%&0rPȒCV :Ed! 
޴:Wwl>AWY/SY-gDם-WnW?{"t[=S71G~h* h&wa_b&t Fi fUD"eSEO ĵ% D!+!*~I%DSP3ߐLi4ɯ¦ol=J-kZ\]h;~k_ھwq@H=wl0hnp6EkvR誹 5\}tWsqi?}ꀭfd7g E,i uekToV/-OP@8n{ny.iiMg נ:$spk1 p+%=#SnSr)P@K+.} ΗT,Fޡtďpb+/@Qa>a ϟDAs޵Y` ٙfpE|_-wrIژIbbW"Y,rIVS@E;;JV\(E6?ы>8oІ ߍ˲>(bOb&ՊT_2ñi28z٣6;rX:I0{bkoZSܨɿo6vrtEĿoVfG땳{uN{Bl64-B.8W;Ӯ7=z7Mm[L;ґ\LEc d(\<݃1?y8jEMNh}-A=t$}KZtYg&Z&4i K_"EI-I|J\`R=i9'sΦj6iFڝ}@৉o#(yXzOg||.2O-#Jn~ ›}~Qڗ {A{=nh$M㆏c>g`_gC-Rᔏ-:۱ ^yᅣ>ml9?k6M#> ABcpǠ2 cl2!HAB !ՖR=-zQ3i cҩrNnzbVJoBhCh-ݣ~dK},_Ͼ9&$a"BR%mΥ`XSs<04 -P>cݴ4` t4 *[IV+G@JbTw:_k{~ZZ-vVT<[6Hd~ܩ)䨔VS66WU1**?}m8)Z$ڢ L$3F!c0\y`:\.YR*O3ڔ%=m, "yL &͵+4㫌PR kIƑXYe,t(9Y_˸8Î*nS_̀fGBeJ BfHT"Q3r)ޕ5D&K IAHQ`Sj5 L&ߎY8s̢;biybIDZ6gj?f$Ac>a Ϥt%@J i ˒93 {,+m| Vq ,CEG%U*!ZQR|HY%ɩF}e$pZxX:Sqz5Tcc28\.~i6>5;2ji8H~Exލ;|gtBdWq&Yѡ_Ka!#$1rKpDM-I>jg!$9 ‹ B  0xL1L,+&i" !0 @:3K< V+ ]2F E)V^nW)J%Ee`N!GIEJWdgOWh\Np-3V8X<(Rqy=$)o&C:!Јiϣxb>OBπ<F1ape#b6&={(M ugTċo=$Al}SIg&k3αKbd'.r9K 0ƏJM+e $Jt \&&)zUr&bySrS87Mq9-R0ӴvJѹ&;nMlj'MM~X\Yy'%o:v'x]>Ce "Yغs(s|wt/;\Nh𸯽,mn',PCјMy>ڈɶ{qgcO![N~aFTcٮnӶ>w GgOzN2M.}p>&ǑcdPh 6Z&|ĔVa8:D_+[k}tdV;~WE@92P!d0VGLDRה2)howJW$0 W}]M6k{R}]{Ksʵ%-9wLT%M soDg!h=iآRiH /w_ؼğ}aݟ&)G~y;ST]P žBXMiGeVl%W|K.EmZRJ1&]@c!8NV^ɁKFVcC;xbǛkL3qK/NnoQ4elkJ9ShX>V$qBeS3DH9x`sRJ΋TyNbH*ͭk[M=DGNsy]*dhRFٚKxt!l݆Ie/B;I[(- d-ǤQdel hJee=NG=k5pjI IaUVL"Q0QY>HQ Ń+5U[-Pw/,޳ TSp7V֐uYB\DmmW^mjGn𓾬#r!zQ5aT!&r-DX8G^58h0K<{ ܩ#wus2?%O̠614fed.DO9G?2+FD%K. tr%"*|f|& R`0'HE4"prahe}zS1֫$X~AЫUX,!f>K (Ɛ(i'! 
%67C&үJ.'wR$Z'"eDŽd8 :$TEPOQ| (NK qcf^`*cB1K&R!hnY1̤R;YC!Je"#r!X`h@"lRNH,$C~Dr,DUhڧngrM% y1;4䮜IT*!IRM̔cH=!a~m4-YjEƽ]o/z7_R(Fa9@4ؒ5>izRR 0 >U"=yሀpu6ZI?'VgLN^D+B&?E\[JRR(9G6<.’MBE{rKl K8Kh0A*11?N[;])J-óe18#t'SQd<JukKm3ML|5 Gm&UNy_i90o&Ҍ#FG2ͫy!wGHνEr' oN,mPimÍ[rJ ^HKvrf\ZEݜ'8sgf*G/TȜDfćM7nx\zSNۯ/ 0<Ї&>5y7KH3ym?kpիrrGc%Utg9]ф; H1uRw%}Tnt牶G%F6!m>|7 ug>|}\[|1 ~ig8k :ghFOu4E-l'ZM3nW^LׇޟUómֶgurwR&V$>?-ZhG4&˟:DTL׆tx$ 0:͟?O_ {OoNk)ٳYg0`꿼6vSajj>v 5eއ곭78Ӹp׫_ӟXP~$xZ 4J_mʯ6r9[ W?A*^BxW V{iLFxjW:x߿8ݵ%>G/[\}>F1AjEV!1RIi cs.i'=}ܾ`>waymI۹ 1 Qa"1gɴN{$a0ȳH2N^7zWsK#fՐevDq|ZtP gj{ȑ_{7@ v6~: |)&(!x߯Rdm25n%4jaY|er#%7y '']VN.$?]&l|H|q$Cw3:zO?}_͓VWU4)K5E+"IOx q69[/o\Nκy zp-⵻,/z3 gl7uixm3NslGu6 ۻ_ClM?|s>[?`Q뫓2޳\̖&_~ұ^vϓavat>}1Am\c4b1: U3 ;dݛBcNێIfw'XL5iW6|8_\Zl򿃊˰rm0@Jw]:pRb.PBKkCJbrF`#0ΡR$͒[k4Oq0| Ӭ+,ڤןdzAil|x ez6bz1ܼ>vd2]sn60w̓jaO8e赾 {E9j xnsSIfߺ/[0 fܸ^$unTz+Te 5#|L)2UT}0W/73l<(H/%ȼPDZ\LSOF-i];`$6VKc`pzzwkux/r ^P7@ L Z0-\c!TG`_'zu/g/Wx~|atۯ(.TٸT[BoU<yS_O:|nna(q~gO?d1_uE#j(v0Ȇ=tBg\Ze<>޽{ >EXS~4Y\e#K+9RE/pC a1fϣ+)$!X*KK*KWy5D#+Xq4pťGB՗WZ҃E* lk(?r3vR`+wNUXwX*Kk TJ#=+X㙻ңjr;vR ~>pwz(6]?TM,S.&nj[Orۗw:P\vN )ﵭҽul72O׳|~1?F75KNk#d7D]팑܇xrYE :\2gZ keH(<5y%')'2%Ǯlj]>()֎Q]cc]oZƨq2Wʸ_+~eܯ,`q2WqReܯq2Wʸ_+~eܯq2Wʸ_Neܯq|Kz i;y)d+˰RXWbJa])+u֕ºRXF5Ja]=Ja])+u֕ºTOq87.vc6av{&Y"*xj2 *0N&JV}CC?Žy9btB('CD' ne^Z˓!`sˀ*E„0;m\хW L  %ՉKYS3Uxᵘ8;& ٣I*ب7j6 SZ>JC&}FPIIM4`C15TD[éB H6L"8hH4C9A+PP›,-E1\\\7a~q=bzWMYX߾[fu! 
lo-mܮ@My~\ݕq0 #G%\A.)э@фFŐE+ꉊ*ȈH1'Er2&&NB1YϢ&)R U02Uaa) ))bAp#Q~eOخ>7^t9]M3G<1Ҡoɒ4pPve]]6C\dd8˰ɕJx%f6f NMLXVm #v1qFl?ŭs,]L;EmZQv`v(|BI#[A1DL)AFD{ATo$!gHFсF@bLP8BɌ$# Tv0a LI3m҇0"~x qq5bZ/.¸*.V\wO$; h4ڶuȔז(nefJXcp1c8>!pq4;fKiǾx( a*Z^/W#wbj2h o$CvimIp59:TjlQ}o}#RhSf^49o ѢDJkげ͸I=z~>vɁ- 'y:tt2*/yL/%%"FB)M)eAPME&Q)ML $NO>jS2 zD1Υ\Oc»\,]҈:ʹZ}%j]3S)ċlwb,z,y!xa&tPiQa *%p y#pDȺܫV`R x ,0OѐBw&a8:&&Ej 1 DY>-0KmԚ3əy=b,H.PLa棯p7=Iyx5ήɇ pdW}X"fW3jx 0`I@R49CM3V &a0IY23_ =SsC4&D٤BDЎ4*CA6D9Sdw݅i7M'qqVЈzh"ڃ"Qo8G܉_q˶L9"ok1`% =$ԣґݹX@,BJx6ЧUf9`OdqKII)NO[Kʉ+)K%o=ܦ 0 Vn휚 Ż7aziw ,[ԟdOL'JUu.|zo-vij[ Pݩx;V;*sc6om_\m\y"kv[\<< iV8N8No@f҄|Ba[inuz\}mYeDi?7-HiD"盦0Lvw y_xl.gvlqO,{{`s;=0tõc:o9ܛbq~գݱv_"[ֻ/z?bSo_cnam֝S,[A~ĬFw֦oș·H}nKgF!.DKxNL'f2}*6z{_L/Ǖ-61g Z2+ϸxArQ A$ &ģN9fr&9H!iFufq1ؗ}׌}_8z:ҷmKvI+)~͍k0Oj&~ah'i~|I%bU8銼<3߹O "/OX jfuܾ\5l$i%QvqiRBh%SdѭC q6_n=96|o AvDAEBvEo. xo]#o=^x:P-6h&F{јt=Z::*b 0nrM˶4fqr=\lDX̯ɿz+Ֆ"u׉biS S^j״1MМ 3Lo鴗,\LS{̌8s83Nd^|zR^LǒP (C9lEcf tPFk!(\>1$9X[Bl^a)RxTPᶘ8;^:B}&yB1an[5o݊M9<"$C@֙]J;^]Q72r P Sp GM6pD-׎T-f(fN\)F9238 )q VIǭO.S|7Gq,@'j’`-*$@+H Y1qv֮zE;[$ 9#G-AZ]̢-P/T@2I\5&p;3s+ ؟CVH᠔H* ʂpAN{&"./%vj:"I뿍;#!Ih/Jªp)Ab!:rP}8Ǩ2^Yd&¦'{;7Km>iO&(Lb0`[5HIj~BGs eA.7<[JE5>]to|MH 3~gi LSIr#g>Ng Q@NPUӡm>Ć2 <גxr }jϞk) ]ч,DT*I3"UhZ7NF$fI]&BĸV]4ä8?%=|*)_s񱜴śۧ2;bW\:!w,ȟ}۾eyvmFv5O%q<^A"úo,soC;v7VmRO,ZNJ@Jˑ"Qf4(ڱD!4^ = D5E!ax0WϮnڂ8i\ s>z$[]^e1D% Mk,•0B)vWE'eWDY{ԭPTh_ЧNl#1bV6$MdYU\5cȤ"DLb1w|opw?fwC{m66|dyО7YmvӋdR!g]Y|WWR'+c( N%Òg:GZKM`Wl%DlKU5@TCqbʄAH,SܜwnR@fh񀡇$#*ېu.sw]Ǔlɉ&y EHxY^w=z=rq9j>A~r[b9 0>FL$?'9ρO bVRU7[F7Ι|뢊S_ ɀ7R/cfD:{Ҍ* %6AY$1*gq {s/SO.RͧkC޿~.~TkVΠf+ ĒC!HU-W2;)[#f*)刂 El ƢbXKPtIU;^78'u;ח7u9F;&qƦ, vdYȲNzӻeQ1>F 쪏DSr5 c "&D RaJ̈́Ʉ K&dބd[GΦ֒UV5Fdž BNMJm͍oQ",e̼вSbZmsM06G6\0Yӵg肗TF&9o.p=REp̭7 -^sS&H2V2YIHZԋ/]"B910"x]¸&EL1lݏlW@H"9:T:yo4w[ɆU9&W~d^ Gr:ߍ>q&Q% &8s%`I54:X5:)}㧙dy#h6 3.N>~JN&́Lf5J'aM7ח<̊$q}qQI:kkr$# s4Wd˨/狇axWR⥈M. 
뽖Igc X,J[u58;?j\WϺ#]VB'?3%DS% N68 Q(N'c#,27S"UYau*K;uesL6;V4e=;v\^j峯,hYu1=]6;AlLlWPe%k&+YKj,Y-zVR)+f%ރ{J` qvT˲^Eҡ0%h5+df{M~2V['so,CgK%W$k1'Kl,l*@Jβy^Pˁ%;:z49ۨ/?$U $W @E c8 xzfK o{}G|{^f8;aYogv1J¤SRdLuTttBG9goH +6Uے3:"%Zۖ,y.fO,@IfwȾlǦC}qqm,sLJ%ʅ6U 6v, UWjsVo=UKcdrQ *l@tRUۀ9cFL6+'Hop^:A4ɂG>YT5[ $+}!bY/QV)6TTddaS?ӤyICgw6Ce] Z%6#Bx`:I e $k=ɿ^_ l C,ٵXX[+a?S3c09#5hRw7Z>X/UP@bXm!5L'|k)Pٸy֑ZCl,5hP5UHl|}bR^_Ȍ/7U6gBk4&MCݮt.:::R U\0!+z{q߶q^%9?c;,>cR MZONڽ%G__{}]Um0" 8%z*qe:phgy',+ y_,ՏNK*%Wgye~q~ TT#c[s܃7]dH{u r,!C:U`CZ_l_|46%Wqr^~;YBnxJwɯbono:S ɥu3ΚMI%% rV6/.J>mlQӧ,a<ڛO>0k:m?{gf`%ګikݮo\]m#^xwg,_Η0ݸY\4^y~wo\c9Q֖^k⽭An6oGq9Rmz_-jq{͟5}gqnW R,FxtN+Wnn-ݼ}m<v~gnA~|eZanh.tۓk>kf?S&>9W0@FP;OKQv'0]O&r|']nB7,No'v7Y Ij8sd3:kopʢ{&HjV,mܾ$n3ј%q?4j;Y'R-쬒vM~/'gWW{b|oxI\NǿQVEhykDB` UlNo'?`CFU u&Զ*ZPn91U3/mSq5cLc5cs0MeWu[ 5u+fp˶6(Q$I"HX@_A;ZW Xz5FiX٣R( ʁzqAvp+a@+a ]Rlu 2@IW]M0\r[@הTB, !.v} sYHg [o[/mK72d,U*2˝p6⪐Bb. 7 +isb@(U{DPco`\z6=,ʛ{ʧ,0z>\n%aҬrI3+A73{6?e:fB$_l,xc: p2YN>g.^M/ӕxӿKg/ZK7}!8"GoE0YIYc8AIDyE9OA@ Rl "\i R "tMD@wNlǙ}15B\u6PkĩBՍzȂVH\+$P8qUW AWoQ\Y{8#qUVcj/jq.PkOԎTFVĕt/.Z' Ơ-5SGVTS /T+fequk!⺺eAO]ᨰ%'/wnƇA? ],+Z~#]]*$ b’TSZ2g~mZ̪ Y^k?hĩ#ՎwtN@!ruO],Oi7蔭☓ꓠ 1kxX Iɽ ,[/Rv!D6ODK5 _0n-e^HX0bz ^yb3_*|K?lwciD2,Q"=y=uEqY/~=6]U_ʏ*p";fU^RI͋ t!ʎ\bO R*{~8{Xv~KqSM/-l-tP.v|zC7y5P/,Ͽ;h mhiʏi=h΅uo^SvH6:JbgcN9 2N ŐeNjʭhaÜ .yo} Y,!7J$v? ȝg<ځTW\JKu),("QF2EVB40Ϭ1&dEG9,䔹/2'+X:Ymh4akDD:̲묖-Z>,L?f431߽I+m/MyQƼ.m8!gK gMx}u x_{Gn%ݥ,.?P)$(q^KBr!&1Np%[P< fMn_{;Aŭ~@)^[cpn_?aS嬑  E>CT^쮥YWw8Xx<Bux]Y-x.@v=RѸ;Z. ;~9U"X~ԺF^Gس}pOV=E4Y딘8>@Ajv%?>Z .{v5!(@]$_yNqD$9)::kλXw~d`'":з~9bܱhyM2jB7,p]R|8'%J1ĈI^iX D MV$)@} ,Fۈ+kFjk[:yf)s@]26ojRjy:-L ^YG Wn{ٺBkG11Q3];ļq21/lOSuTdc 7,"tY9/ lg?O~D&l(EJ.{AC Lr#T1Z<Ľ39_/G7A293>"Eqn)]u`tvnV t{mtn>ׅn۹3!Ȕ0])qɧr SzkP,gILY`xjeLYK\8 8}b>&'ǴAq*j^{Y:A\;h[ :EQn+H%RxQb;~rڀZ(Vِ1cO5*˨ 詩 R .ڑD}YpnBE9cNE_jZlDDE& !*k(ѣpJ)*kN A)>ro3`ԟήޮ8~TGqQkXcsRa*Y@LWWU/WJ=y6e`+FSEn!HuiG3n*0]W_/l41~=`YB0<ҧ5TAiEC^ZŬAg&S^p1e ̞Ӡރ:8Z=1KDقP yMw٧(*zuzIr|:tlDghH*L3JaгIaPp.) 
z BRҝ&Ia/%;{+nm|noRFur]_xP_hwXGcM9OZ8EEM1VzcU= o:=7Sgk4NJ1o4TknQz8?_[PjK[_S5rA6e5r%ۇ*~27-O҉\-z{ɿ}3|vj9-MmGF>lH!Ii}J}+e)8mb2D$ljb4+Y^+zl:ղ鸔SF9urN@& t%M:e#/߅PʄlY^HLUQ`k4Xa2 K.S7Ib`.G]9 (Q_=iy`Kn=FqvOYzp^ŰP+vq$TҀOp J<5-oxkYS-qhO{S׳r0xX+(vEYoH YMymɠzhO)!ތtL*zUDAM$ JQbTo; 8 8 8 B`XkAqR"nIYpI"JGH &Q7sstMo<ÑSe~=ج~s.{q4zjb'b.)&Fʙxh_&8EܨsսST ,ektk]dS5<ѥ`Gx٪R)-z0^~4^4)?sP$D&CdJHadҜ5ks8wù)AΖGqYesbv useJ۔%1V`b.HG&g(y"oпn V+`t}ܢ2ĝY^ދ&a%ͱAGGBG`/?to] AU< !$) 4I+ oR ZGyEjs4- WwVQ|?Kg䝙E:% 50Rq2!LK"x a!SڃFΞ.[D$]\j2%GYeH gbZ HbWJ^z8CX_u]Z?(-{/^?~POՂsMDdѰ=UBשk}lNU8*\<|~U HI&$,-T$dV}Bow+t>XOC L3\,cv)X*U__Nv }@tJ@ܤ| n<);$k)Rdu Td~jgFEE_Qp,}`ۻ,KD6:onR3 +8 x$cWCfC, }iST<*Ta~gVFݰ8R w_X)Lo8XyGsِ~4N7EQ:ܠAE֞U/-C:}7[I|BlruƶYKW/ɤq8{lɻ}4 $dʭWmxռ(ix5/%!fok\yGWOk<|у^&YϚϊ;alfM+)w:򹛽,aKۑ,mBJZ6}9dy-7Jωd4Q%L(ڪK-q/gWTH<$pBIΡ&;lT6J*Uh6&< ',X<%[Ь^@N0 EzaY昒1TJB4ҥ"gKnv=UdW]jP X%z; ~|}[ppy5zRLA(#: aF(DR`9y!HiTqĊ*gMpolg<j>jr4JmJQo|PE<# ).yq@r0A蓎1$"3zO O3pnm6%sGMu#0X# :dH7Ss؏rL;otZT* ^hvSZc<7*hew.+q7XYfj0ɝ!{E1fPw}s%to*-Ơx$sR1va{Ӌ̀\$B(O1d ̀̀̀\ Zބ%IO!0+DVQ9ڼE| O1_qOBB5 c"SeaC Ŷ<&0>1ଢ଼Կ._ >Bnyn̍n r#sC혹quvwCOCSޯнWar{fr̽cU{EMDɛK GQk؛5*AMM޾&ocVH>7ՍUK5 :+@.9ks*̓Ka^3)IiS ߾eWGz m"#x%:H -1h)(,2]vPm!u =yӷ:m}OT,v}~X:˭zv2}*(@?ߵ9 IxsJڱUey&2q0NoD[w>ԼwnԭNUדTwݲN* TFy\2O-+z9eC)z¹ ચ'0G;Rq~0PP3*Up#6FQSatو gk~fs%$L/,_ eojwR:z[ {DkaGQ_-lW>bW`*8R%jNx-;vb!{YʸX2( 4ԐUd7։)+I>@76EÂG] r AXd ZӃn}ZjY>*$Z 0|Fbg5ƱR"UF &G}REѤT "@R6=)/5$S"I jQk HFnP {qƞXUg, >+(|q4-mWw ;w/j#$M(m(T!`\U0br jw}Q[Em&Ԟ4{Q#UF6R1WN -@) dAnVeTI3ھiCs"A,d 5-+UFÆ2V- 1bT3v#a<McAn1D"NxǗR@] .*" 6[س-A_}5@Zuفա׶x0PL9%mdʽsMCGv"cuv}qvE3ℋqsXw싇a?<|sUCMя/V,я7hȟ (-?#a 57 0J%)qj>=:d#?_=o3[5E9RsR9ֵjVASeK^%Z L,e0\VS*M@pU(̊b*tc"N~=`)x~wzxd'^|{&vΡŏ5ӏX>dLn$VQ0"Fo*l6/1c`טR*vBgAk60 `^TVcf *8e2RM>@#'#FЁb/`j1*Z혠C'i hDI;BR̦30t#a`8M%Mp7{woR4agGm-SvȈiҾA}}{>-||C6DeT҈F[L\zr 56u&㴙-.> zBτncOp1T1D2 sITԏȈ~֛U84e! 
ě};\/Gx0.ZPr2*1(2PUUНźt&u [}'0WhZU%Q !F+:rwCP[l6ĚBkTn94'Ipp|Zxr fA2O Y"Z5ԲOgɶwzݦX|n{}=.;?Qo\>-SUߐP ǟnMK;P-_ſ/RvFaݖgc6/}}<D6/87M;N[s.Kw[mlmڠ16V2]u&z 9ͭ/o(n"o s |LPq~=r]z|INtG<_UuY/ /OoU},RԧLڤzEkga9>ǟעv$ӫ6[h=a~ j2Sv c+6U>_3YO{n17x4u_\3=<Gd_z6[i=V0XA9Xb.s U=9Υ+B["ҤKuZYt,AW$ξ)ctH;qWe`뚕O*x V,|js #v%Lᇝ@q`G%_\%o[ٖFRٶ^8zzũȷ 7S e # R ddWm\#&H5SBzNiުC]è-ϗ9W 4 ً὞ۏ籠CjE5""2G aVڮVC_Li`J:>#y9{WJZ)B;gbZ> "zACXЩյkny_Pŗ=sS.$skkozmSnm.epOr/mW332"̵8V-Cdpm :F`9|TvvIrSKi] PZGsj/.ڐj6#F5FfMoye*`D[AYTGNo;Y7rvA Ƈ{Av1 e9Q:vcM&,h)4 q"wUB<8=YFdsMhч(ۅNRJG;VɈd'6# Nܟ#V6X2B@94Q6NPW#1-io.1n<ў`<ڻI*NQ:PƨE vK3%crum |fk/GDh\ŒȃO^,·]5lhǵ`Y7)ʈ*ń)bA>OO ./fk-_RmE[d)/n+^.4.69d0sR碞*E/ׅg~z`3A$cyGas'9h9<#ggkfЅ-N~:ҳty|RVPO{}9[{Mb 4?2MiޟG6%J'_//?0__$TAtlT&CI8`w,3f L`QؒW,_-ɲݶ[qyYMEvUbXX.i;p\xYK~?|Bkwz#^}Fk{i_^OUZH$P 54_% 8irbc'}xrʾ}1yެy9 k?fO?x̮cI咘3`ONf++jeל#hEQoK{W-#p[GRzHgۆۇ1f0D6pT'ZYDFBOVc][c֫ {$muRBB篣q,]_L7ʽ4fwzzWxYgq/O@o[~oso@+0H‹m$voq?=6wo~жi{ڲ&g=_}UMNy˸gz6fq$ޏoX\.":r$xPWQKfg!I/λUKYUb#A}Γ ^/vG_b[Ha+!gn#DQI E0$2hN`!9q C岸pJu6c;*} Q]08%= >*IN^ۊ:_/tKueEPRUe f};dzOVw`k6DeQ7>xݟ^V5SWG਷bv:X dJhY#ȸ2ȳ10}{ߒk٭ ]4aa4:_҉\x~9Sr/{4йo,}맞>Ͻϰ'2#2ȱ}K*"I얎:}%^QZJJr xfKL-oF{cD&4%%o. \@aGk;b{~{jW;;eeLKM9_;EPozsZ_L4\]Kq<ތs~I5jsOru~?eELW׻7 mv}v?zt]άaOb1DaJ"<*^VA _΄!fE )Q\7f9uڱ=L̵g=SWmNvjwʿ%@=M+~D`1yFg)`{#sZCF#!)v 8Jޘ覆7RtxԦ`C8SU8&U;ٯb#Z*Rg]J}">*~-GZiFͅ&+XA,RJf˨ W};5.Yq/ZormN4 oݵ֗v׮^nahnDIosK:?C"gRM&73IҙUEcr.8ۜcmKM}{wy`܀sN)JAPP 5$eR)$8fRzS%qA&Pϲ,9T& pFx$rHD\1r̅1>Vdv"$9;!󏃒)?/+ڕjwD3 ǀB$abJsJHIh/[XG_ }mF%ꚧڙ@׭mu,1 [*/YIoR-Dkis6[>o䐄tKLW*e(!Wn]!ѢC+9f%4w.!QfɮtdFh,G1 0EK"EpfMȲKXdVZLt>O6AdxL4"o6F(uNGx҂&/rʀNK6x#Aۤ iA(4a1jR96W:ig.)v $iɒ|XEDR@RDN*3DT`YHұwBT43JcɞDa32}&lg2>&wz@VF*1L@M PC9&L1w2",Y@֞Cp42_fgP|"wcqN~qۃ܀ )ͺRʄ lsdCF9; 92r-(qHIJr\TBd lh{s!xd,vsd$Lr:pK#fBJ讨-NjX[ |s^+wMŽS 8}>ˏ/dvBn14ƜZ;8::8?SԠ)W(uT@@(  u&v}/m{EDKȰxU= $D)FHҍJJ9183,2,ASP\}(74rYJf.skA4}9W]䖣B%2d5NfQ(Ia. "p1Hpn$'v] $vH3c/q8a_サQO&iyt>M"&3b2OqAgO]/SHzQQ>80 P&U iRrMeB(2/]-籙6 0|YdzѰnbQS Th<n?n^v' +h־?fOd-}D̛Dߠ@:?彦! 
2 *j"gW ٕ/#!Bgg]]~1W~zێz<,l;MV&POp:< Y5*bc__6]d$.fsϊ^P~+Kn *kϮׄΟMN&#:d:Ŭu# 'bmuEլi9:]mլ;ܺZϫ7w>?٣絖a<otw4j<)\Vj}:mtW>ʬw,Ys1E}|?Vm ?~̧] -!G\;ۺ=MK#4"KrB0R B%+RF\-ƒ`*g!+e*,!>[\pKZQ¦,0l4`Rc2^e2QR 1愰JOM6EǥE(\Fid| \GxorsGeS]RXJ@@*ybFV$ZQ@\]ڢp;!`=$1]$t$~tZ*"w$<# 6r(a.d`rY )c9rbJ8Ah:&M"kP 2 hcJer9{Y-WbnQq[."Fi5H;Gdu:jG 7!jypEBDI\QgX%J) QpDqMpu lJ^yjjinۺNzZ QiKMF!tRΤhϱ\(m6 1Nϼ2BS( N/r1h{@>۴.H{N@EB:f-jY[e٢mDi7;_fbU)wC/zkE*|a[$ah҃* 9G_O-2q 'V&ݧc Ɩqdm8 KE=LTue|QpHM(J׺\%kQBkcS(,EuNϢ&"}O!dK{ )sy@qlHRq|m̼)TPc@L ʁLfRpYu9ӚKԦ]68BG[^= tn(I*"+&{ǃ!1c2Rە<.Q%$d{@fw'mjIQXZ-sToC pl-D4 Xen n+ cC6겏n}??{WFJ#ô@DMl?{ PEFb490 B0Mr/H &g~z%BGYA{BN#0tT@Sj8hw׾oo_XΒ=~ gqRi}=8o ' Mt u.;5#ky$o}} "\ҝ*8C_PĎKyqN}S C S7A)(LQs;3^.kWÀdJYKdx\T rQ,8Zc ey>up)aQ.r~`v>**bfâ֩6 ,1W$J4w)laz|F&#kgUt#_WeipY1MY b('R}=K#h:9[L0^2ZzRD=uCڻeVy"h\SPKWD}Ĭt zmF``/z-MHH88f 4pg*o7kTOV56UřLN`t~H}{1&w,8zm$d~} 7Zv5t j]K __[->ooib-ң#m47E  )\ G>Wl*\DIՏT9Ts_B"GDv|A ]]Y/n;`F!q Z4q,QqY+)=MʟE$W64{Mnxvαl)iaEls8,?zҹWNg^;/u7;_FA/_Tn;mG,ҭp(Ф(rGy#W2=?'3tny$XHrm~JΊv2!~IGͥ.l>*wهV:j&R_si,r'{`k@Grp)B§4F@A%`A0P0gz ,}W&7cSyZ]U(:χDy:.Oè ٚK|(U^6Lr}խm髵mt6_ZtD9=Z7H:e2C(hjnv'hߓn|g{1GwT($θ`% bnӶoc~}.U#QE #l(&r)y.Yۗ]΢\NϢdQf90p"bF~m@Ƙcl:vLQ{ɱ,@rFϵ,wJÃ& }!۹kv0`n-k]}us3= c_99pivޭto~ &lԋW{C*ȝ/.y:RDN4s9a s(P[Ȩw;##f;wud0ad#D5riT(|> aLQj6Wn-Cٻ1ha'bvҭ]FOD yUʂ^*a<%0$&p,XD|0a!$XP;#gQ'2:ap'«u'9yJ䵚s.3Rn玣1'5L) wBNQi0WQD0`H-^D)2B{ФPyJH1 N9ťodF^I <%kd쌜;];,36B:B­byҌ[b`Wkf$- 7_Up8`(pZeFĬf4ptZ#6(J 3ZEvp*LC2'($& fhBPt$LEt!űU#vgaę1"Rh h"88f`!fXю Xr1 @#FuН0F}<qW3D{D#\im3B7^ )$? `A BFNMV!!Ɓ3u)Id$&:łKs`I&dҺvFÈxy`-bµKgqɦH;E=.{xq~$vʁ<Vj$ҒF`B;s+Yx6-@Ѿ%?U1 V` 8CVnNp{(bs(_Ka)# $,OZDvq#Lѣ4su7Nl{'Ot{a9ц`{1cTGM*$vj5F,B0#BXY_x)#"b1)V 1q5Ysj'ǫrS!M=ly_W('&gXvy{U9*Ƥ7N.Tha+xf] T(.4G"W([#S(`160CAt XHǞ&hpNJ7bi1g;J"d:#a`Yop7wZ41]ɻAs-v@INu jt2E#_8( "DBh3$qՃjlӁ쁧Og~=гG/ yR)T)gHilP{!=CDrQ{yn0Jt-8^mqH"F6H\"JC7ĹEv ̣CB_ MբEf?P2QRabGّ'+n b q(kcED^Y@=XSz5ׁ\]2r! 
EFPǹ7EPME8?vzJxnTBPKȬU!R4sxҹ 9af`BgKDq,Њp.DmS(EWXij&}7Sd}7#}7А?j#wa-VuT'Mۀ RZ1ۻE3Z׻u!r)Gqz7U:>wAՉ}G6,cZJۻE3Z׻u!r)Sp8A|0 %0#ŭ4,{se3)36ߕZ&q㜶0d5%Պ Nr$eZ%e9Q:FXP.XU K<6>̷_\WrO, zK&(lDϓOtmUnM%iΣJI;.Iiμ*hS>f!),Mžɼ/'RQcP)؛bU/[|պx_Շ^9>R${ѡ'PgQd ~W =% @WJYy} g|ը1 '/`R,A:~՘1|i0XjpP}Q"_(kfLm)Bj46Q1#14ӳ L{Wi=|ޣqle̒WsQ FGܯ 4~}} DILC!HV@rd}0k߇Jneh~?l1༦f1'j0ěM޽y R5B5 G2.N,ڵBHǪM}cp%GZ"Fm[)}M|`_tµ/7_ixy˯L^JRѣ6oD6$(9+C>vɁ&2dM .9-KZRZ:Q5Nh+sΚ]1^MYb%+ƣ ƕۮڇF٫(bC>k^%I>WiQZC)-io8H`¢}ָrz#Eƒu׫dO8?WCFCST={ .1nsEI67*M O AKe;7Q^B)&=~ParKScsm1Pp 9v̋) >Lq6:N,f35zmSMHifRj2U*J DJZۭ[J h{XH{¥𑄓?, HM_z]4!6ъm?yYja:A.ogFOᓍsM[bnOؑ|~h5 ҡá qO,^;vEARHI{* )uucP8C҆]XGP&Q20[ H-'[Kt K'ac@v$fFc)] j']JX2* T֠Lh4)gEs3+:fi pح5Mr{"=/PZ!rŊ ۰7=>}|W :, `_^7;؞E1,Wb8`USpPu%[5ЏL,'İ º2@ 4!lya ,%lЍ|CD 1W(JD $(lgwdϓ?nFE.l}e<= {*/|OwƔA!pݭaҚpy)>3ԞB1J1_8c )y4 =Ma[ivoiLKXMIM8 L& ~\R'-ʰAu_TUh'f otV=%,l[ưxZ~&3H*okxNI#y9)I)~@lv4j,s5>8!)6nr *J2c&'#sS)yxM`;ͽoONѰ!QQ@SBBF<lH>%F0)$XS-䰓J0~lJ9ه?0rxc:!]`wN`t8<':Juj [(V[||ns0{ T- @QKH΋̀ ]y3G{Ν#&xheOLa~yE$b界ҍd^yw>L pw^xÉ)"'J}{mxĈw][91;r^:2z&/b,%ͨ*\iL 9=?^̞oPV^i2a%~f׫W_Qjxzw{Ѐ2M@l7WX+}X; J58(2qI \rK ejH0s9q8y%'!NPHD$Q .4^ T.BQeʁaA&,p$NLJqrF6 0qjus9_otQ+<|/GX-w //!*ITOz; .W̋Bs]B?'+?kP|{yO拕ͥO ?{|?uÃ>C "87%(%+w_R/rxfM$Dn[59WI+]x BTZB u+D^ksYx v~KW_u6z֕8{R/^ >?yqs71b{( mhgba/6(:=/^ء˥Of$7Bf~DZ U ^zo߆E;ʱGdYGEXA5?:IC&Dzs2inPb#s,8޽bK.wPK secr 0`g't/#z^{{^L[,GkC%YH($L}`.!D_]~.Ԫy݆ϋӻ0 p`F܀Babn7/>]L+xNxKZcaEm2SS,mlaB Z%^tvfMMZĩ$=w@1Dw~ $ϸ"6L"pseh)oGEY#  j1r6RM*ֶ^779YM&^S/艧q{}>=16g(F̭D8Ck2"V[c"kpH ;n}{WG%`H:j4g*F,,XB.+Q'tp;tWl8ZEP>V"gq&&F,S@ J,;NCЩOGHE2 d~AZ0L BgVs@ ;y sQw)maO?H_8'L.a~b"" ,ˤ`E:3Xq`S]䴼̓[mO}Hg^²1yW3uygt r^C: aoP(^k00xC)zjwfkJu P0 R0 d @ ÂA;?͌L.Ds࠶f1xшTkn*ht#ɾP/ 50ijVaٚP?rCK=e(gX*]T&lp:1-5.ƃX# S#0TESn^QJ-yoVC,cjg 4ψ%JJdǾc]P*]8c WbP\qyHL(˰;˸D9tAM<}"$e[@4;DjȓE'~E|~:6s|BPʶP>*n~IJL8JU6gzs,I HzM݀UYYb%K!PKe7ar@=Cl.%=ԡRǥ.nE<=7)z[#$Z)I\y35C$f¹\"\XH?Z- p p@aPq- J//ïG܄rb>^6X?Mf <ڥv2w@!60V86 8$Y $a (Akmn}y[ }PS@$ubLgX#n?3}W[Q.{oJxEH6 * vi2|N>wL:. ϮހC)Mm޴@ h##e7[GGl CFhKp[r/9y{=T* Q^! 
Lc Q3d_?O/3v?=ns54/~P\?to`.g斞}Yymʤw%dW5 e]13#&~#7vN]0=G!\uJD OG*HRafX4Vhb}`8Q^Pνp G,/ey>?ݿ âp_G&CH1 IeOI!/ yB>}Oɝ^NԔg}0?z @.g0;j YYju2psEnX'\t7n4I9(]D uB+Qa4T QP O9a(E KN~;W3u@?< Ja wv!yEkR=YڏxؼnjIE3Wzn؛ms@:}t$49vv49RN+zcjQ +; $ O?жAlֹ7AvJЕۂm̾[w-U7`³ܹ2>gFAw\<U xð1?SbrُOb'U >(yW r 1T aĩBVac)xCS/-x,m-Wqx \x4m@;RD{oHIIR/9u`QD q(`g☜ gd3J!|N=` dBJ.:XM%^~z 3*{.Qre9ꬖoYFTP_+IV ]„s @IZK8O8`polFD~c3FwVwmvη+]}'J |v-o>K ܪl7jm۝ūΞ%ih^߃dzakԷ+ʩ*n mu^+^y{7`dWם_pWFLWCgd]['@UWܬ^waV%1epPI$bG98_!8J \]i7ևz[@Ԯp@ϴ*bG<G䟰0lӮ ^HNgs_VOT`Zg7uת};/IAքvNn8'q2Zgs\ 8'aJIWh4KyY̑w^-?^zAUJNbWMIGDC`zp<[4h?!JBmk E &W6)Ìrrgp4V*q‘4ZDJgЎJ1M aڶFh˃(gͿsAN26OTg0 _.[tid+E1qR="iNFiA |+)2abx*Ջ" eg,*tps7Y M TǼ"^ 2O_*v2莯bm4_ IB*ˠ0hMaB>Xn#NbePtbĻ.~:k+nьnoDlIVAgʠ>w;]tZ9بmh]v M4ʦ;źwJwAӉ}FvzTVQm{hFzCX7nQ6E8׼FJwAӉ}Fv5WpJnьnoDl_ Ynsn2c:hN:ǫE3Z׻BqۦB^vt'˶Sa˶3vȨbԲ#]As7awƓag~ya'l*]'|l!%3B/vU4Y]nxoyQS!@:}dqJuiMҔ5){T)TA֥{bM0MM+Xߺ0i z䤇5a<͋ӼFr4i^Ӽ4/^F 83UAQq؎FeUЫT@9֗U 4Q'Xo<ke:&|qv.ܹ"5VRM2Un™NQ2/(vHr7pPc8w`~ʔWS/WHǘRq6v ,sTUPu~M(X½b,`/dr|2SL@IVP'nsN{p`e7AtI 5Z%H"1-l'qW<KĭJe1'\) F4fՂ)L:Nʀ}q{ءbSIrCle/P0]1~u'E?|2jVy@КjIB Lڒ!EZgX@8UH{fMϵƨ)ׄ}ktv*Nqʞ0 Ŏ(T5c^zAU)I%j4Y9/k$Kd0$?A780gu ѥ{xg%j@# PRﶼ_ DٔbKʼn _B߈+Q ;XXɧ1-24dTq0aB" }>Zd$+4a`rpQJS2h~&rh#Ն'Yp¼9V*Jc0Ys8E*/FX %QJ->)p/$rqa=OaE.0bC_#c3)IXEoM|)K MC`)a&ZcdpqZ9-]I5nD~Xv'kOgJ'>2qYR8M,OQ^1 #W%'iB\eePJ'>9߂ň:z88-"_ZBOQ HU 蟤u^:C=d J ; 0sŬH%"D1"S>'>7"%OƉ~v7YvUfϮ⋣Mn\hGObu@dJg\},Lk6zps5_sr6B"HIأYpP0qIDԭ8VHMVH8fsWq[3|g0 x6 "; Y~Ma 7dJsE4HG0u}@70>21P(@3u> C;JBC:]~%"M]iM O2%+ĪhM9QHldID^)c2!Q[5u{ Jy+jnoHVO|:D 33~wU{oƧ.8ouV ū;? luߍGt~v#rUJ"ΧPL؟b84A⢧-XG:7/R0[e4;Fd &?q 8H t[!;BU_lI)\aN*dz7yn?mď?rG4ϖQ'=#w%h,}vUXLm؟:B2gh ČwӉ |IƬfrye_^?9&$/_<\pU"o<ټ5ࢬ5.:F:i~-ݳGL:CM|~g\hh˃o3$e *׵Y꼐=KТ$[e)?//!N;N]Uj˞Q{ l,/NJ(ĥUـz ]9NfS08fRSɳu*A d\qBQS=!\ AV-մ CHT{Q">yc$5N7WY#u'Y@dJ%#v4k*t#C­PN'A?Sme69R ~W/&K[f0+$IMSMdH"ي@iWh̏y.)Y)ļ(լzbg {/F%q+JS P 0h/?0[p'/UʂcӰ =}|͆j][FA#T+c7}js fB тS.vQYu<^k@?LJ&,`P'^aI i_\d6rC?JSFQBf7̱[Y#tUl3yd‚ɂa&;)ۜVߪ:gHpKKo<,Yvf+fҹ١bKfqv:EK$f`"O6#lϨaeu N}%B?K?!4^~^рpHߕ7FJh+l r͏c"*$-0.%TP/? 
#;|peĀ#ȆҶSgn >?b'mI~FN_G<,`\/ M[.Krx1dHf d SYWWW4bKB1y~c)}$s>lc1 W %e8m+K +$wv.kt#@uK:R+xQ9|$GRPū9(b=EwKȨ'sġ0F2x݇*(YYf1 ʹܬĂwUU?ۉQL@ vo'|))SQkii %{f{rVtV^"DBсg DnY6l{{0̈gL |R/QXW'{**jN UyTF!Mp L3UG/ʐba|d!@rFn;`I `Y&F?z7}U1s)qX.>Xirv j6!ZפIt%HƮ'ѕ(H^AP"V47 [ D\ z&h Rqt<۞DbHB(鷬uCO)2˒~M9!wl%]P31 <Igbl[p$bl4'vl6f%N\)`1x`")mb/ѾU(n_e;qD.1^Yѣ .~<~_,gwL(FFd J5xVkTbu7NfQNZdw_YP@"ÃcO_ >V7dpH2=0W^'㕡 De) :g\lA9>{Z{8s^xAr Q"I8BJL)$Zs$+I_~[ SCcQi%͹j'TEv] !Ob Hm+tn38v)uq;K0A(\dw cMFSYg7; b}L%A=PqFd6*RQ-[bFˀZŸ1aM&wSY BS$̌b a_pqBIxҌbl'̐NL={E-4SXZ%B'Z &pR,%<%HQQ =*^h,a%JRވV-&(φ\F-Ia"'CL[`*0z/ך0&v9t )YFH(!بh,s"ˋ@@HP!n;M FlUxd44uAZ,QU'@46 &E.2ȹiBFnDIDE^]iN|l#+oকl~nc}ZF_dQJ+(useGh&*1搀Qѩ=)m %sJTq{=bDO\)deo^Wc'K{ؒﳡړ6x0UqRyE *7zƖ%炉ӭlU]wѸ.ge2}iFPyM͟0ffWOTXk)T,tbp)`=wxBB{c`ꫤڠCȾu5=$ル)"e%=w<.a%fי$_L d4ԋ10[4j%׃Wgم=23or0 $yV:FWY1=YmeжCWs1 dygޙ/I/^4a*2H&R/2/|bmL[{͛[%}SVsjTsbΆ fӵ gfh6knŨ)Rn]ח|vX6bߧ;m{mw[{A/2*$an3dab~ _GZMp;*wɱo?v7=y2e]]}Vu^>,^WBU 40( T 4$OH k., %y :JaC5:  ՜Ei0& G =!)pRߍS.lwСnu{Q~(DwfjZ` N=X^s!*Xs!bpMC=Άcw:ٚQcn7w9q:}/oˡv _|?X?QG23wzq ر\@exuY^ 3lbk.2+moՍ.휟"l#N7Ȫjd/8r3g-L ĭ^DkNrs8'T@2z9*bwjǻ|K˴uZ{?:Z܃7:f c872x=D90 %]]փqw蚷p8,XB8$uw߉vDŽߛ{PүHpKYiYy\&$a\0-sKBc\h\<|h5c]aًRi9Dp>!aDѝDc{P/ O\')7j(9#IdF@g45U0as͉nvz?2_|JtIYkwx/D:w澻0\xIؿYiU398q+GJ 6y~"ؙ$1_('vd_QC}ْ(ukcAbdUXUX;D9$80FU"BI '*'6du*^lMX"fn;.+[GeD_A"_HM2qVIG1)HD :18cS00rž:œ(Mn]J/uO.h6/6N]9k}z)rG /W*@S L]`؂cAI4֩TjSX-DhZv pX/ndr4%ZB$`q0QiD92@B"#v(P\@okngqcTΛľleʳ$wPe .EuKyBX]{;OF.dڝW>žjIUP!ԗb'DF |h \{rU@M S9կ y*SVGn2(:cTnGDԊm[`FZ&4䑫hNq;췬CYGn2(:cTnGqy0hYք j'/Mh#W R;ɶud*2:FvDts*$Ѳ֭ y*SZuqnY7 hBePFuBǨbݎEY8ضu fukBCTNJ-Q}%ӺfZDZJӄ0Je1!qZawLbJ$\Ը-Jƒ>m'Ɓ ʉRPMSCa37kXiD:!$N.BePƸQe;ުsZ8~Lׄ\"ґ4䑫hNQyĹe0Hq A u;shYքՂh"[)W NF7XxpnKKu30t9찉8aL,S)-5T`\h_W"ѩc4M,&IJ(Z3E%ZIj"b)Rb4N-8d#E)m 3$u5|cV ܉xx 1dzZPP%wJMh#W"Lv~'ٲnde7rdP.Q*lj|B02Z>/\CT]v@Ĺm݊-TeT't*^̩89`FZ&4䑫h}bZ.nȮX2nn85Cs otO<1|5c8MƟ#_E.$L[K)ݕ MO2Zdd?M߾6IU%IU'[ںkAKfZJJ9{UĸcLzdCQlnj܁D&>,&<+O`eYH@簫侸ǣ?0Ϧ{V0W[<:[U ػ9RhrA_FKcs#ݸi Ÿ-i Ÿ[x`LN 0̳!+e;pѩ_?'@x c[l+׷gL܂ ,\??@VIE/. 
e./@'Sds\s)O_a$ eFV5 Q=0>˰q£LB?e=NलV9_ ty9܀?y2\??{YGCU1Ѫ%2Zve r +-i7-Zs Ul@RY ]~k][VxŒu#eNL)WvEv2YrOjwg}bWC`3`Oa/+34s6$'Q_?**UqctHl(3qsStpXfgLף^n7ᣅbإskQ3`hFCܟ>,\qx3w 6[7dW%N=j9\Q3۟.cJ:Jkި=DfiTjJ\#*Y0Z!0& RD }%mI8s Psi!Bm-BEkbGp] 笿)1Y$xԧZjQR| މ2(U8/pJ QLW1ҵАGuJ+[U8#/4 Qor6A`U=YXX+ kX9/gg[/Q\?ptrq8}($<*DM,y~n }ĺ^jc/Dq՚8`lP(JeknG`ܠLp ~f;zj?7. `ݚ0[k(E+l),&b+ _{k2<:Yh] l; 7㽿s6f7fL.ˇx{5nY6{^?/f-fHPQNZڨǙ?TaYZ*qh_Dfo# ^䟾 e?̨8ND(VEr_ʎHTiqo*.޻ҭDݚ'$} Dž ힽrYi ؕU<ݬ?$>[dJ|hpx#'`uU%L1K&R,2ZF\j$4I#s\Ēm$Q*w'QlSXkcI& =a_pBHBbƓ80. G&a3$*';%Y( 2}p3*!)/#K;&3B͍ٕCj:6duAOp}gf/ֽdZIOoRN0y`u7G˦OѪ/7'y;Oq!t-~L\n6B{O(5@eDpV0 ~龎OoJ>JN:BVs@+Sb 8H}k[kϡdF;}*vl^P='0́z;J*SaNt2dMV~N1Yd k}_lem&ʙ`-OKOH`wB\DYs,i1AuVc{ͫ|)Ow'MRdLjY91 Iد$֯XcRCL̅C*EH~یsjτ9Mf]L{ .Tۈ! P+`PFjKJHY$+I2-7MSD1V$4`)sĘ+!xEb~HJgqwhdB_3L} 5}ٔJEMS| ni-:]$OR!\$jBլ=/yH4a9i鶩 ڴ֮h5¸5.Bl\(&|Hh!rb3FENK  .aKqz. Z?d1&Iz~]{9Sft)gTR+ZOYwFV tAwpVg,G kAZD=Jx3=~/Qrr˻(ZHJD;E [uVKP>gPQV T"~تTkS8 16E9=_*t˽;w'@&]qNfDQ]Hm=$S+[|wq QL_.293$CaL6zimp.U >fp3ZakZ󦽙sSWNţQZNL];ZSJGk^jWLe2KGAjĖc T;Ί) &L'U~9 Md믙Ri ֨F50u C̸`#K1giStD$o'K!p-Q?Te*QwMT+KoBC5[@ulC,_cQCUa&p$_T+uwm͍X25;K0R;tu $@[۲Hr'H٢$RLRD8;W\ 偏+Ir<m1%h%<VbH.[2%=eJdݪfV^&=On@v^" 3Sm8}CSgY*I莇QaT8ZRA짭Ǜ-k 2]kp+~ 1ɘ1VOMwN]!2t4e2VT:;iF,0şykSb1޻AEB $"*$6>)!&!JYzz(RNx&4̴ tLB $0$*hg'W1̺39RH ,nI؅ASa'hH"q5Wa/=nd9 s"j`276NaTt)P =Mb:D1^ j1 CAhEAB (;U Å:cQG0I4ay, 1LXTH*O#@$f*ۗ=5 q2Jcrl%N` 9BKTF1Z0*Of+0rկHRpЩ= cR`'HIw$:2e"(y")Af LA e.r-nLiL)ի""=F4rG^aF{ѐ߳]c cL#m-Iu|&仳tCqMt_ǙJ~WJ̵K1G!XYhGD+(J(nC xZg;ft 㵎Mq8 L/ۨnT2Smxכ~+ɂ5x{B/[Xy &7]2p-x&,F(R5 tBn|>pC1`+f-oeYI_& pJ)]Ξ${x*!*>#Ur,X$ѫZDŔ6D(B$4|E:Q: ` 3!L񄤷Tx[1Cl(奙uX!+vx[/w:c$e},vs/dbOdzXo2(8]%2JxsCx%_It.gu_:(jȽW`z i _I `sچגۄ Ɏ`[5 U-!/ Kߗb5+ ?m01aW⠲&MGAqd% "&#ҧBQ!#`x`b=O2%02l,$UVZdBWIQ HS?meur)B&|\jSojݭڠyp P#k3tb~uuuuTU xADmxUU)Z)rRP[ƙ}#P`rs5q}e_] V8 [mթ`yg u7Ro4uHE)b䂊 *Fՠ*CδDaLYf$ 甠L lH@+ ЁD=tEP*:jc7?D/5?LrC YA.B)>pF ٌB3ka㫄,l՚bP*(־]\_=" T9ЇulH}$&' Iȵ6DД4\!J 8K #9e< SHXY3:r r㻂ȧ>~ŷNK/ޯ}.磸U7ɲ(g!8\r~YbFFBF4 c]Z+ۦr1Z杘%>\(hS"ssFiMŮ0H.qm*`k6ּAl`-`Sk-kG@>熖:oq]TrDŽ1ۄ%,aH.z+75RN 
B,fPEa?ip|YTXNPjHRp±V603ӆ+̬lVYؕYIXwU?.ɪѯAg=AW7F;ԺYMWpq9h^mL2)Fm牞>n9YDmg/1U//>S,#9ڏ'EɃMif/ rB)ں)K@& 0q{\T$Fg RīZ%>KnX>坻is2 a@x{_g `[Zgۑ-:HUϣᴣ2#}ưmg#ڪԧ"pK}H lKcIqO"F(2D9 JoK_@'N'<@0ۊkH `u (=b$'tt_.'Nd/?IIII*12ʍ0ϐyӒBψBH1 2-V͙Ё/%S;QwXoh>7TKL97L`OhS, 4,Pq%kp7Jy!?)ݯ 3vͨ@̤"TA`r rD믩TR$D'zMgrJ C`+6S` (hK9g$\XŭhzM%4 Jd s)"gR&8\K[T!9q> WSwk*_L26y-׿/*kżdWlp]E++X]ݼ 2rz'225Xg&X&g(IaN 3RmV}UsI Q .t 5" r+! {?LO_3CX9h]>Ez0huO}vO=[mi.Ն-_~oiL$?f9ݣͿ1?<fAWJ=xo]z8x7yOP|߃ _jh Bs~bhm⦆<9v;38WО׶U -6H8m<9ZYIwt8l^i3W"{tXN^S;E2:xoͧqfީ0Β&2`t&UL< .&F>'*Yqh =wjX=`-J}|c9s BX;nc\!HQ^v)hTE<ߛ'#ʽ\̢z$QJΛ=l*@u?s&M>ٌbSڊ܄{RzȷfY<IBp=X @ |ݏ] = C);gGeֱ(< s fbLJlpWTzt]:DM6/ܝ/huUt/>_8Ɇ~6rdb%<^GBSD =c9jCU{ A:*SDhppǀU  1]WVc> ы sQq81A d0\ρ^ . "Vl'W߀(6Jنj@g•nD'}dڧ"JC1A(27< I(׀k;r 0(q Y}_!g׌ QJJJKξI0eK2sM]i_r8⒇r$.ˬH"L"~E&  L檔 kVdr{ƚwb %xۊCOC.lZ8>5V-idfgIYBbcWf0q&2t\q|]j2l9#"L+yNF)H*.N\|}tOOo'G='Bg:X 4I 1'chHr ȉɞ1֙I7>7-kKrfd¶wQgKAElD<Fڑy9@ 敻ya\p,6„5[[t{ABZw5_;G2_lGHKHbcULZkۜxT{R{ncCyKa'2Wso`s.x` ky? 0H Hr! -D'`*2 &k_Z1.,D˅miGB&-e}3y!'nQ+" i@A ZZf`7MJE눤tDG?y ~z psR}O!ws&L2ZS'pթ?o]T tT ӏI8Ф3r1#3r1z|AJX kmH /PotvM.AgdO M $Y!% )R=R4TM0(c65>kerAs(ljb֊UINV9iPJ#VQP" hn]Dށc*tJ~)E;ЛHDWŪBxO7 5+G c]5 SMRqqva@vclޏ3i7**+w22Z7ѣ+rN{&S7Zwkc,9d :ő !FFZ Q\;R 6'Lkܚ J/9xfTtjϭMBJg5 cl9^,[?^E]lVyAi_ }Ikoc9YTBO`D>guڮ "3ȷnt|߲p!הt/ ;%<0[,C6x]}rDI1, Mzg;K&ag3SdR|/L՝㋊WW9i }ٖ5y' T`*&m`91ٯ@De{܊K j5}:ݞǾiOߏg2`OvrO6⨖-<1/aڲ%~ /E衆&ha|Bl:|-8FhW?7@ |tJN^uiʄ%QZ FS{J(:"PK<Rf`Җ!.TC8ngݶؚ8k3"1ɞG?@zp6CHӵc{JdzԑK EKr}=5=׀i)œc L}Je߿H20F!يru"[7~ͫ.`$ k!o?o^|>#ZEWhG??ǫ!Y?3x!Zb﯋<]I FzNj?3IGP0`ۯc#0g_:ts#ȾlU"!yo%O;Of P;oq4C(bʭbp]O`:[W|ݹz?7^}?ϷS;z0R*׳',,_X6c)iۙ]dVVSaV^TVy&MQEB*j4Vo-߇e#!_6)Hsy[!CiPOY8]h C]!Tsx.aQ6}wPC݁9WjV׹_v57F`ёmÛo&-ݿ3;Q-0XHTJRBᲩrbv`˴* z;FA# DxH-5Q/t18uRoBdҁ&l6P;IA:YF/gl2C%XI-"^j 2"$}Oԃ@qx-"N, FhM),|&0cPBMN\;U4ƅP̶ͅ$CfQ5s[hEվ YL0I:8LO ֹ`ւS `Ec BEm?|tWY< ^I(t2G 7AQ-%xv=N(JNb4z5Py:Izȧ~#:D1\UPT\ 9)ī׉*Z MLH)=Qq֚@-7t%+lw1԰b}ݎ/&14r{bO0 &<-LsvHBIKHE+ lwX7j c(ya5<|5_ 7\\ ,?|>2n@A;} ) -bqGEߓO䔷€u`( &\)Р43?JHQF-P"C*~wNAUTG8>q[]>X$%z[U [NFP7įI&=.f2Xoʧ}aRޫ 
_S,}<]9F)cvѺ8ly>Φٯ#`j1.J x7oƟBBh L)kzb<\^QP#s>[ sc;,`b[t<{i#,M@&# nOF2XswHB.Xf\9r!Sx.|z?dB `/*& =*yFUY`Ձ;烻|5IC؉"cm<Ea,?'MznD5wq5}`ѷqxxPqV_LC]Pd.ʅ/UPn&ܨHgPOo&/!P\]ொsY\RU۬!؛C.pb.4h(3hN(U(ħbΉtG+;5wgcjjm$' /h'T4֦jMOFKl{X86^KBx+BI$YDh$jw#Do㰻r.+&>,0FUъ8/]PJ[}T9<(4QZ+t]!A7U@#҅3{!cr14#R.c 4HH7rC׋sR))=7JsQld C)7&g7 ȟtE[v0"tNWbSf*JuBYpJnGi-9I͆j"TΓ9t;g|U9WXO?2:R"z6\|PP{d] bʝicgU$qR:u*Ot(d[TUex6G4g9>W)^Cʲ5 ]^ڏ")0=Xl{zs4_R:^xeu!:*}g,5D2CM0둓kpsJz$?έ+rn]J]m[/~lKka7ۨɶoc?tlzYJHxrt6APMY^؄q/EiehvxFÛMR\ +jxn!wTpJ׷I8vU-d?o_x 5RT'F؅zL'ax#ٗI/d_&ɾ,Kn>e(N q4poWCF%!N<"J^s. y[q>xZ 6wt>X7o~K"u凰}<?Z%QZ/ڮugxk$(f9D*Qy-qN:+! AcغbqUyvX ЃM!|䁽w(t*b w,v=*F?w@lGpOr :3>O_ R>SMyTM*=v[[&4+2K=WOh决zTɾnqٷ9| k,5AqPf;gv jnagH(g{v[R*sp<`0/T]kʼ2qb\ڀ)U68V,p 2GY\!`/G|=/ߑG)7{xj_ :o>$qK؛;><D/p.IύPؾu̍]Onw ,KӾh/>q3˰\% ڥB8ogrZ nȑJwb{̏\.:GCTgz_iu>]=X`3Xz 3nR*XoTUQL2`wy|_0d#$3 ٤9!-zP8dڈoli\XmRc UD @kJJ+mVęyyg[]F#be Jտeg Y0 kU '-Iqڲz< @X vpK3Pci&贑޵<3ۂ9?#2- Z/*;{YyT#FL8ΰ)bLZIJ)?<ɭwՇ03[cLtȵGds46aK!r%HH:\#uI|Yt%YIUMqVczYlW :Vd BR]-!0 ٩ S1«! R@b)XE25<(ktM ?}6M5Y~Ysa_Q~_c0o&b",n6ɆJETT6ӀJV4x9 ͂7j,Xts#Oã;giyRyP]ƹgqRm(bv(*>_ݪ_êhxu]!XƮ ABP5܀$#HIFHF}Řu /ݵ:jXyw:w-/͎Xdg>b-lx >3A曪06  MGOֳ\Ѡ2>ORƄn-vey]Pĺ Յi}QNu0*A%ݭ~ئ^#mzeODz }cuݶpk=j}0s:?h{whm'jVc&F1OtO0f%Weg'G ػ 3aϮ뼭?=ԯQ 3}ryޗ1/z[N]cT#ƛ}k8{r/ZN!ID 8nϲ{!Sc|r%ޙzyD}T8eԷxjCpI#`cCm^QEx&5C!x(CA`X4=ku `.ajKO.N4.xk12K|Yk(wK({V. zǐ _Qq8!M'&@ut7:/Bβ9|0oPGYXEF]-}5\lAlt8B 9C\Ku>M!5+agy{UĽ~S\cYuF6"fM)_0pѹNdt_a3*e%.9n{y|xxo,)Qnz5ƒ 5W@[rMyq/g^8z^4(S*f:dsSxjdž %9@.+%RLi$&i’{bBZJBRih~A5.}(" BNqZiĩ0^b"GY%Ztxk'"~0^i$fպF/ԔPA2z;EU5w5tN7m~^| 7vͮLLIc0^vYJ)~+gmfk 2z}A{;ub:T}M=l/Pi=R& m_=}3JZ Fhj-TA'V޲D5Q:0yRl{hPg4E*c)J(Z$ְ6뒗5J4;Jϥ2V% QNb-XNh*T Þ) ðVr @QmqLa!73v (-Ьp$1 0IAgTyK{k3e!c$ш VzŘ,M!7J x !Tz"Mg5/TE@'j/ R%8Sťu;luxWHPbF@Pr(ȜHW {b汅}j-.!!2*-J<ݥ/j!w*= gE'qpE7Qq&_e_pE1Mͪ8 WWfrIqe||#lln~+ZO(|8ɃHsk٤Fw @R{FID̃iN9z#d>٠2?'=!nx=!rC:0%ncd@"Rx04i9x%A@0/u3'N.x\],Z6`WkAŠC. 
5ZX"|vDkKq9a#In#_M#+/$&)s˜z҅4>iVsꊞ^&̓S.9K&z:zny|h)F.On'̤DǓAyĄ>e^0YuQxމ*]3EQj.]oC=FVI-%jTE͙>̑9]E?"_= u*lVY5mhf]}4&h稅K?G0 bU[&6Nx HLL<޻C"FZ 5/9XDئjH;OHѝAU(i!d*ЋlKzt ihU S(.F^d'GJۂپg(tJ/$[ M n7跍GM"-~~ Gh ۏk[KHK[JHQk/ `uo_ Auz~|Ђ A;b#4]Ÿ]|+`W#Zp3}SZ(qZ<9ȿ)MIozH ]^XKρ&;r~]$]bZ$99wmK`W]r|+B;xP/ggpb2nϳ=+|/<ړ/_i@O O 9 b] w v>"O c:u#4IXU"Xxag"FCj/L/@i 5GD Z}~C 4Km@8:cTGc_C_5b 8Fl=je hC _6GM_׌l廣QKk1SC~[zi_Tn]e˘]՛oo֟az}~W^-Wۙ_oApDb͟~Yl?}\ba>#&r6__hJ*=[WY~"3شaM{"{ϵx?m@f/~i$E4Jj˰i7̂[,uD'&mҺ)G7=Ѻ-\Fǣuƣb1#:}4nea87=Ѻ-\ BƚX:f@Aщv.[#p*{ݢ'ZW+("r@8bPGtbh݆ ?j-zu[ r)FG)Xq5 Aщv.3p*MEOvK!!W.Q2E3k7QdA=?f'ӡWJ4rOr"%Sj#A9[,uD'&mjBݢ'ZW+(RdAMQLc1#:}4n?=Ѻ-\FWA>ni=Aщv2 p-zu[ r)GVx\מړ~fp| >f!13>cNxh} >cNacc3\ ߜds1'X O /%ǜ}IWlr>fFGǜ})WE O̪c>+ cfUm1gsҕ 'Psh )}ǜr%0vO c>+I<=3">+Aj<=3=F[dU9|z>f{|}Wc49J M,H/u*|̂P>=Jfs1'] AcJwǜv%{>f0scN*owDlc] +yЍuq}f:3eZӷ7 j0 [K jq7#8 b|~6MO~|{s`ַ[.y@03$[,Kzsfkc I+ow[ZP$kQ쨡 &/f_叡iX]lP_:@ NfO^8%w!ip6V`mw: JX^QAj`bL^q rS-Z+ve3-OB :ߓOOY~tٚX$OCL^@R)I+j@XvFj4| LjEE@(FOnD7wT{jϕ|pV.;8Y-.E}X6gBͱ%HHA_}zp "5'O@3\ˏ 2Q2"eDJ%&Vh" Oa֦=+ALZChV!kmHʀ_P~9[w`6v)za1C!)߯zHJC(IawEأX&X`2ZIUZuWbd!4FH"*+U:l2l4 n% DZNBHkX30gQX',N8s[Aʙ՘RYB*>K "4Ca96Ĕ3&VUPVB 2+ 2@0#Qq @ |n8'$9!&83"PާuI+HUs4i5];V"XijFʄ*Eib~|aGh8W0Fóıǽ-_q:rlG6sNOy'yq +xr|h2Mw"c *Uӌ( XD2L8w;̕yg{ML5@` r%kdA}Ə( wɜ%0s oc5\cRO:F`?M0=jpZxUЎتQlF];oȷܖ4nP+~:܀yw3%pqI,"e:z>j]8 0P}L51cZ@I :gM',~vpeHNˑcq͗+5X;e~^H*-FsӲQ``<#DjlF9]3W9%1Ŀ| G̴lR Io=)wuI7$F7s^W%1_4gE?y{2YB'y6_0s\sO,?c` Hq,uIj-!,A|;.Y~ߏ3!D_v', 8dҙg,v'D8D ߰Y}3\,[ѷv'\K=hyE2oӱ b*,\ۉ >FӜ4dY<^M&<V)P5>)i1Q%?K7LJM&i4 *WNLD/vGі!HWqFG 3o˷qM24٘3NTQ?{ㇳ2Ex™@V1ٸef%bZ;3C 8~L2cSOH`BQ[݈x)? 
`75bO J2l3XAyc9ׂ`Y<- H3)d@\i'jCf-g53w\ymLM!sw\\Rң^(G]Pqt̉*A}|/Zxqp$,ܭߪВѮ{ ͗vK@ >SlܡHWa-?Hx9j 8BVІz :p77m[[SM@xt渖\g{e7i8\s7y׿>0ߒYs YUj)!V!֙8zJՂ5|,:?ijq3- ۊYhVϗ"gEZF$9+7>HT֬FZ{t;j+ rµfO=^ jׂ|+6Xq{T7×2 DLVo:*|j\2I TkbW~z7֟pWcǨHwbYp4JL>ꐴ,$v%6 <zӾf;+#k48wT΋̂<~Lz=G'=^ @9[Å;&}Q`n޹:LRu]sҙ J|&X$}}G_+T))+mzYoO80)e/YqpbJgM.C,t^(y`RMX@}ͬ]4zd=o _;&{Wx#YfJ6T8$|4= Cu?r=iY7e\nn҂*hT*iه$WZδVI-#ȫgy#$Q^,p&8j v+'K#Ej-&CDwk]+ʘ?$1o"Hy, 3/s+` ,ʰFK#|l^lB噝%Ly-qÔ:E"4 u(OC,^`fO3+c*l$Ң+T䱄ba዗2øVbiTXHh=$l@BaɒHˮ >}XxX1ZncxS|]*cMJDsr|TUź %Q%yt7N}(kv۳waR.쯏K]: דQم]8+?iUF[Wkc WMMox-|s+մѡ TV S4i푷.m CUg;ߤdBCA~2L8LH>G0m1"7ˬ*]kh5k_/} w>GW;Nc_<(*ؑ$5_ *#ΜIwv>}~۟v#tؼ4c?Vx4txZ\.I}6m\يOG5bSNQ|~)bK4&B"X2ep)lW tCV8 a2'kA!).5"-?bR7y J$0P 6#.?UwRsMJF!],>,W(E>/GҝaI3n07>Hυ^D1bz(7JV` \J*Sà1DJ`c4qseH"KG>g+8ay%'*ѫ}^'9aJ1f@eL/}ӼwU|o+\cYy^^eF%%K | )-6>`DHfR[Xn-H5Pd"Z#YJ_ߟk 2_-?Gmx 7/^\f]4wJLk(9{xޙ e7GzxC)iH|+n}PW^ʱwN?-B`-8dhch:ڜ; Ffpf85q8rsv„ ig-um¸Rhu/K88cP}㞝<4\,%o{+Zjٺ8APsU8mHRr4JaYaϴS8~%z$eI8 Z&>[є&k#yibF17'ff4MbpjD*t\&~oG ӷ|4F/e]j 5) ԉ!nc5`!SRX kEϓturKSfFٱSflq[fڟWoc$_O <ʤF2TSx{\h.L*76-Ɣc7G̎[~A TH!e4#,`ncqtWqrq[K84,uؚh< ӯQ)g1 Qܕ w\pPv>0Dhӳ]NɂDtEkLDD698/Dʱk;x$Ο%\ګkJU[Қ~âb5wc rig&O*,ye oIB~`~ֵ t Dqۥ <!`=ư6J ٴI}̀] 1:²# 'd@6JYz}UK;vڟ4!Hݾ ȻY9\(: ^lX|t- a_>^J< q6R)ݓ= h}1gP'ygJg*rgsB4fm\PLvNpGG- Wh[ZrXTcj{"Rpf% P7,/^ml\NTB7~IhGv,Lt(3=biMps1?+ T4yh@Vxrl@2 = 7) izOǧM0lx}u3_|`z=֟N7{Kʈ4?Cȴ~PLo\ޟ M `s)ƀX즏zX H WLs?-pb~9ր}/q@>H]BWp"#ƈ(uƠ{L}VC,Gy$ u,n d x.}/, hViLkAXƣKd83,xQ/ VY4s;'LY}XHw5e k V-.+~~'i6kE!!W(xIZ24[@R3 O悠3eGNȺt!EFR@i-QyS>p*M%v $^,-ի{ZsF+oNMdB½YvzY/SR2Dj:Lन#V# Zb@kY% c2>h0 E- iM`ZH2ҧRU}9[ZKz'pRY4I l$p`0DCluH-tj,U<2B viXA7 > ?*TozHp$l*(0a`9%e{BXkA'ć` ^akL*]xe߮>.)}'ӈesU1J*'hLѣR%W=nz0,z dw CHNbLN%i }fB&e ܒ2d"g‚*R\;N$*mq b[^gSA3 2”%B|9NG*B#"T܋ui9鼅P xRIkRc[9 SbN5.ΧXh0ڱj1ECDbTD C ks$>NaKVJZ!Dh JyH%dFh(|J}'QrFkr mYt??]T m%s馼%_UPܔG|jM7a7D%SAv%ˤ_Ԩdn3^BK/'O+X97yO5B熚P%MRۀ0zrbB jO5㌳M QPA nR4&Xy1dDcS3}|j Zhs7;9+ B|GQUe (=.P9+`;D5.eP3èE=zEHwϫwg~E;$*A=얃4u=:9{VNgNy|7<=48NXMw3R~6ڳ6B[n]!Oj/6[A^t;.:x2L*tgz]sT5*3(12q`d 
>$5Jea}-WS[wL^?|mWY_1E.6-6g=ͰM"v0Pv?C};4OdFjUjsDJjpƥ<ѮQ-7 U=Ci.m vݕ%3-xg3M'Bw,!WNW0?9 <]zu8x됄ᣟ9 3֍^OoνK'̘IppQsS2}G:[ݧ`#Ͱ[ iCT|s&)91Z4JtО1 +M\{k3V W#l16"JJ|$BadL6Hw:#,"Axԫ0=WTdy1>UYYQ#C7<"(o N-HkUZKy{kOYZl,Nfm]jlA &ۻSM)ixK!{=f`'铵vT=F{ =;Fwna~{~1߁tNhvr<>_ wE¡~wW{գT TJ&\Š dOo*=\VLp2F &S_&lU?̓:WEuPM%q yNW ઽ,5Υ=RՇZtKU J )4s|uĺ@zu<B8 c[,:2P ~ DAӱc,eTTK0Rwtgs^_!=J{IIM+d'r׆+'J`:k,@y4kI՘SWZjN4㨦^դIw`JiLZT\@g륎9Xq<4fU$AI KKG[+,Y)-,.ZQNO~ dq;.XE]Դ^eV2=0βgG\|]gKJǐ7̭.8CBֵB¿vgSp`9-qG%"oZ+r8!L{Y.QB(js6X'$'YG2_]OPXOڠ_ WftcXqG3s}~HsרWhif5Z3J:JJ:JJ:JJ:*+b7? h 8:\#ep#袧 E+`j?-rs{5躷 F$Z!Ǿˏo,/Os -L {Hܼ͋Qڼ͋Qy,HSilTڍ-h*8?K D1̖/%u_!8SKN܄o.gqu fr%hfmh-89ֵ[LX ,V:lU&29xTJrr8O h9 eeh9#kZa7^M?lM=„BU(Qw>R.گmJ2{i?6)=0R} HTabgzȕ_)Ɍy" %{6A@zd)("${g4Jl1'-w!~ ˯vyEpF cMoO~8#3Dx~N |<5;O> 5L$3 l4uwU_'S˨D7-5_{6^4T֮8HUW}lŐ hw $j`,l)p"AbM!|.?WϳtKl, ݕ $Vho9~iԴyqRY b훫~ʊ7g%åVm]ngf>n޿{iJywtvQjK7hbS=Vm9C{W%m|:Ȋ7<»|zrl;!{5$cG%-[ gQlAyrDҝ#/9Og*ęC/}9>pceM?/f{/ 0%-zec)  ܢ N<0/hC!^W>.d``z֌X ;,!CBH2crZn8؉X݋c 5"QT쮾OsX0 D{P9EOA fxmա5i8qþĐ kqWH>H09>V ؑS? +j 5/y  c2-HWsUb߲fa+[Y |,54-n{Z x`/p ꖽoƢ{q,0K,縰r7-bG=0:#w7}"6LE_keM;|LͲ 뽃{>H_>*ONlʠWt B>忱mA&Ǡ՗.@He!Sw _ׇ( VoA2*)ՖZ@8k{(V?aq`-!}JW CSnjRD6(\ߧ# hØD 5XB`ힼ ^62k~”$49lQ]@ZG<;|=[=;M9({@G8:E#˲kv zS~ |e}]z.ģlxy=pIRn 97=Vmp+j_׌>Z;=Gw!f'Ѿŧe| ;5$;Z;eGZVJl(;୤f6f59imY|qEoNNJ3f.Ģ pv-0ѷbXRql9`~`OYDQɐr<Ýj RIJ?>^W+=O[զXTUK!ٲ]+"VLP+{҈iFh 8oTz{}IZLɖa*w7RI fN T=Hƈ ܢ1lm#(#KY$QdSHkgDZ9U]\z'r ܕeE #եꆠ60hwXQʾ9ZuUFZ:N6بo:khVҩf(ȹR1FI٠3#r#a0jN'fi883T+'Ѯ(r kx ]N#o>53^2V(Yt$DFs1%AM*(Wm0-e٪bKuKA&]iI%P,0}+@# !L("$]B@Ly\K#,#oΩ0#´\MmK19-J ޭ_F b뤳%%b#N1;$#mK1bfOI&7Ϩ #Ubd"*Ģ (.@W 4."QIR쉥vAsѸU8OZ^yM1 `NC~, &)PK>,,B1k]%#W0dRc \~doL;Ca9yvpYWvMpR"9?f@7mtnsT&(Z~w_O.-pkd)TI=,%(drt,9[#+CLb:r}.rGI"\drCm !>1C~9dMH$BiC D썞Ƚ/B鎘}D!/G2A'ek1QBXzZ,0(O$0#fVH Z!'H!# c^,0DJ^%W 1cAGKLeH(Ne.1D7f&K c ANUS f A/hc(Z)aU 37+&Y`פd1f0Lq2cVge1&gl +3\O`|[ۺrqg+w)n.wVhyfgI1' yz<)81"H|gⷸyRՔDqg jf4vNYLOgXz݆qW]dKƍz-;EWy&)V\[k=li @237Ց`7!P8X/axRUT#׫AOf. zW=v1䉱Њj۞Dġ:Ap\POkٷ{t')=,d-A%;/Af 3 +U8I@_Yț5}ܸPr5?I.Y< 2  T! 
JA0v}S gOxvkQoGg" 64!(9h$,Hp ix(aMtR%ۏݼMg``eQ 5 E̺M Wn閎oA) vHg&`4dz{GۀBPM^.]}Dud#{XirwmJaPRqcܕӻ/[3l߫bHMzq\<2)=ekgyȿ~бE{q܌bJR|͈ey}=\Гi N8Ў\6ZN) EFfwnm%7+ H-*` 6!V6ظR[+դmHQ&͚# ̡%gO܀mAd;cH#WJkfu" bvJyQEwTRePF#ȃ`&k0R9  eʵoKv}Q]_:* >Phެ1lzC9D/Ǿ*n`FΈ[XWV*eSS7D.'iq=O|HR +eҿ>'o;l2l -r%Q1^"̼\[ 2Lrza)d]3Fߵ\ĞBxYY=MPnkAT t"G@Iu H f Z@]hA1Ia|x-ʐ9b ( ^i"gqiD9KJ$iYE;\u Kٮ{5ր^qk\ӎQg]r&WS`Q־:kSʙ*p+gccA;iUs S;ic{_yR6ѰiǸSpgɖFÏ`L L!F[)]:tSv3T 4NI,ìy}2sk2)n5썃lpק7$nnSо 0tTi=aNN[j8$ xSEϲB~:[b߽D?Mm} -5ɣV%تJ m\Ys$sP1-}P}e]c7Es{KL=rr= 4 h,>Ўi8eJRi4L-Y!bUkm1CQ8Dn>to%%}ޢ~cq g?Dn#w7j^ͯ-]0?@~:~"ᒶHG{]ȏuvyo{ &퓼0V#nCȔ׊(t ;])iJK (iN5"Yse!x%p+3Rcn|BփJHr%а{bzɇئ#n 4t;@6L0ZAݶBQrTrFZ&rW҄ٮ{eR m .}*q&x+VA+}/G^pc\"۳x_2rfdbcGi{)?@q4t# 91هb/%<>!I r–q*np?C 2Z94%κ_8$V,e A=3]:P/b`m jO1VM%:*SUJT0:ݼ젔lSN-KE֨~t"`[`Nah82`HӘ\m*4OC> otqY&^`#E͌ߚ F?>5(؂\A?I#ʠ"Ԣ1t-lf۱(HKf IǼFf1c)%mƘ&3"ʈr:64I~`Egq7?z)h}ipuiߓaccw'WMCԾ z0bbi} YExDlZ 5E_gMH<: s%8ͰuOwP5 .h :"O,^F']KQ@bDu,;Q紴V\;wV zQ/bdf|GPB|il8~.Y'i27A.7TtDUoY\FX;udS/st0psNvgMD vt֡E{ P.-8@G.n$@B9Tc̕!^J-YwcJRrj͚Ɣj85ߘ։hgu!c%kM10NěTA{+ "ޘU(@(х`VnL-g j&t|ȆNB` tD(P o~')Ob $Ob o)j++Ԝiaё/E)+n2OIx+JTLS( L!w +}F-^S1T&= ulMn0}Sjɺ;+TV6DEKq5sT=97n!jM{n]ݝEד=ݯ2}s.z1WyG^4nؑ?5 F:w5Gcz;NüEm M4Ǧ ýޮx7c+wP+H11wo݆oDlJݸ`j;n2H11xc{lFS6O~ y2Vf K.u)O./&YDkV-N6t,*w硌S1bQ &psyz]?->ޚy.3ҧԔBqeXJJo'fu޸ JT:pƸz{%K7$~,Tf0"ħg@BFbڒkUP%IM1"x/*OxcɑeU4A9;]߈5҈Vh =r3gcw&_NbԢt3`AC/[,[b\}&ZdwH]h ."x4̹+NT5` ]xTS&VeOW!>@z*HrəC(8 LKόjmLGB/YFnF)@Ld 5|FS@'ޭ6;៳߯|]7zBol4j .36:/;E02͆ _>#uVҊhڹÅVz7^=3\lLj^0X14b|R+Q|ӑ98f+ݨa^X~{_L6ˈwtv*ai'FOD=ih ]Y#XHWUPVƹʀס4tOD( !wu$vBUh0%3N$sMr `bJc*̾IU`^е"6 JZNU,{>GB21BjٮD!h!kQa =pEΥdn(BrM="sc1lq@Տv*5|UɭBV#M{v3BhabMFqW -q mHуUˢ݃`ՂaO?]uɭq3pU:p( (P*90Ƌ'FH`۞Dr&b-|1=U;Ym#5zA7[%H}1qKqSEtخKds +Z|x_:$K M9DpIt'mЎTt\w"H)WDuyZR)˺bj`9)Lr'wO[-|pb8dvvHOl/riUK䣷^mE^|iKϤY1H[^EfN>*C.0ihw? 
찓 [5*^5I]JvWV5Hn_Ij)ߌ$sIJ9]RzWw뻤x>Mp 3a&3{$My{lW?v*[!e R0s{KK6DذG"hMqYr7,f7vN83Gu[5ݾKp EC3WgIDM{ꈅFix4[Nt˜Jٮy\a$]WG#9 xy$DFIz^<֎Z5 'i9+նUɰ~yZf$-T:ԣԃǝNt3͈Hw#z{lm[I]ΎkA*5ΥdBQ:OVjVLե,kGJA ׵$267$`ޚ*l'd|Z\>-KK|9߬WPqp &Qҵr̃^ 5F^I)"Ớ՚XKiDdͬY^J'KnFGy*Vt6WpdYcmC@,auQT*s8!zЖvMg1ˆKˆ˶{<-d&r`cHTS#N#(8EJ (q;m6`suN&aR IcֲgQno ө10?g+dR D8M9*AsÜ16Hl* +؁0J 5ڮ&MeÅE!Q\xsgkBD>XS6e&ns/"z$닳 z백2'Nm r@rގ:n aU촰,{OM.:hz;O6*WمIs7]\.>x G $WR09\ObiP.Lû A:ooeYn.+LJ&a災_ ^4(Cd!hYSXt5SV^SU|ܞ/V3=25\F:^6.ξ_'UPm2?}Nb;B\-4J3'ڌjiPJ@74FSk=fez3AE=~P*?']'H>Lh9;P79yˢ{sN?DSOM_?gWJCbYM]UZ8IU% Q0{2j~]B~q>ZFQtd'po0-/;'3xl2/ߐ蠦!!ڭ6RІ1\n/R˕6!#`#*cb Eɝrj^j 84={s؈*|=T{?cwűTRʄwgpw-O@$ıE;$Nci#+xiT<:4dDí 1L8vt+ʮpLcz"if^e>Q{F6owk{wxw{:I}n 64*RqGE7d0%sRXkzuG_vCAdKjPe; b"Uܕ,# ARZj"Yfb-?;?>%$ГBvr!)U _ۢ@^S,laM1ImBZyc-F\pU89U{kC*9΢/a&{ e} @%YxDe#gzA,MS}zh˅CUׇ#f99Mkf/eW;,K)ofIյ,\p?3 /$ (:d;!&o jCGl舘L{'<r ~6s@]⺉7`"+F}Q\7oޒfVzYKi*'SԥzuG3pkYgOet4,Q@ydoC}rsv&D@'Hzwr1RAݠ^Y6swXY5(S%Ea5(iLV1ɫS~4OS躍_5'dUgJkj/wDOY/ R1 vrH6c!?GN$'wIr~ZΛ%E0,?2\ͯϖqIp0~xw򵜟ϪO ,Z/RIqHtYCO*Ezـă~kQܛJ̾x0uOxй\؛JLb"JC$mx:爧D Wȏ)ԶW\~S9\_R" Fhv,p8tJřu9 >\GV*dZ4NqDZ2@áW(?~Jy dPӔOb}iv/ItϳW&<;j#18p2Bpjae#_>lZ=ڳv %!LJm)(vYXnv6:VNRA,Jx(=YYC0nY[ŗ߿-Z}ؾS~Syc=yΣCfő* Sz{~FO/h?ۯ+P2;`rtX67cҞ-HWj]BIB-H̷d`v@N"Y5J'a$K:'j1>Ԟ08NPc \<)~W?0;PyeIYl]K WZS}zI  \ιyԃwU4uU(#Hre jZh*"+ Ny[ḷnս qx*F/,3 & 0+IDFcI裩_)9x#4W$igN@Dx׶Y9ȕH`yd9)h\rJabdj^FSA/Qr3H'B6дBUɄ聮zi,Ьz{{|(uOזg2,*Yu6 z2]ʖxkxZʚ:$X 1e8?;S,#[\bHp2^t/; R51NR;7ZւTp`/&~{1RYüD@Zj XRRr0X;t溪+kU[3VS[1F"z(P-uwO:Q[sAo}f^3BmtքL$L cXNԿi潨fFq*aR3h,yN0y#G.0 }Na~z@k,y/(D(>q # x>76ַDZ)Xh)6:Bڲo[NJǒlPǒKP߼-Vזl:qHlSa{M%k6! 
q;Muu{ԇ4#C+/pԗ ?Z)эX^FGB^2yAjNnPLh]4aIdH|kTSNBsrq}4FKc;s'N%P-׳jx 7;0>; ~Ո9 JQ*h4NP{Iz7 j]Tw>HH5M,no&bOBT{BI7x76!5daWS6WֻT؆W<, B`wVzn[YY 6j6ݒ}a믎K*pڻ]y蟳1M#K6g%J;ubGE4K:hm톍b[.bD')mpb%;XKY6[ +fRlhtNjjHcVGQi,kR&)*R ޳0ӡ^MMpmF,OPDݰ-n{4TM(O$d8`4R!1;vrZSp˵'m{m$fý`D 9ØxnGڬDi` 2GaƠ;Xivjl)lgˆĆj['YԑчcW5-| بӾ[#w[;o ldP.+L!V}E| SeSfgNuhp9VXG y95˶k7&-MRL{ xQAqqzla5;Q+=@/x"LZ^`*~Xe rI]7McU[Ma… dQ8:iqeeq1!69P+turB) $.+uݶu'R E7{.M+&cB*DsH*"NgQ2һϨy 8'܎G):.ޞtJIʕ$Jh:HӪeg|siR;%i FlK,X™N5JevBl¦VďG1f塹.e6f[.u>3{#9Bd!ڒ bX>G@ J(h3!TGN C qVs*EF^b_wLz6[jaB~{d%sa[Vs, At8'4R+8cܰlx5`#f?',ebI% MKltk0ZeE[CəNchExޠ>_=ޤlM.tՋM 6< 'nѰq K}j-6V]P^|M'p5'˖O?farx8OܐڏS0Au5Sj7 ~{` O0/"_dz>\\OZvq+naG6\rLF,dߒkQ~Nn 3yu0 ǪަkDrlYxp+'3͜;2:RV~-[0q8KJ{pɽ;{__v_ƨܩ}Ȭ@Ȍ3BcճjPѕ^KOg]Yu Pi39|mr*蝍i)*4<^W\SwQi2N1N8% q4Wէ-@$se3jީЁ$#)}^_A ٪d 2M`%1JQ7LAULEa$ZFXv Kb F;XWpIl@t^$9oIiG + &"GD@w` KO oY+G(vX"Y<"J*V};'dM >աEH|QH09Tz`(:.Ǹ>r_N3t,smd.ZT1zxJ%Ϲ3F<#ӣТ +,q4W*c\3W` h XR7Pπ&9\]ij)/?Cr!R^k^^/X |qLak_L!PPxj{OVU>;b( 5揪%eZc;2k.NQj> 6 W{L,E7߲oز{Wxʌʄ,uYL LZ?EjY3LtSJĭdH+lp2:#|9 )QIJo@`#knCB l7xrR3M0gKã.N"@`:Pd@DL ]1Vo[;LbƥSI̢XO($>GZ- Hzswul% %<ѽyJt[rylU\+*M4V"uMVz<2g1B(ˌ_fR׊gOf 3$͍}!g3cr#؇ʄx<㽝 LIFt~zBi<)|nC\ eY H`3kBMʡw3쉥遂1H[O+, +3ccǀ$Ų}DHD#aޙ#ݢ": "X '/T3FmXy9 -2k7#u W2GiEnNK1v%WH۽"e.9 mr6 '+)WI6/}ۓ$ݒt=TaLмNB`g;6Tf^H$i)톕ȼUK-c~Q5Vpq2`R-w hĈI9Bs\oc"m GMfA?*R1&9ќքʂ  nr; so2kG[ _-绯0 SGgyDّޘ G|"S[rm{$8-s7=[FQʴYj{1Nc1 m $@~8 Yr0q0e 6d+#f -h$"dW*XEo }q6fc=۩e\RV>` &UHong6M]5źsAT]{:Mc 5|zqYjpJ*0!D~n%v=iR-Val(3]ChGuqvꆰ3EŻ5)%)&X\zs[T^,Ε{֊6c-*۠ v^ 聚tuH 1>hz(s9 zqpih0-Jt5vi)3Nպ %L]|Vsh5."~ 촒q+[ll;rWj֠AxE!E&1-Zw\kW[sa-t5pb7oTőmHa.C;EeB奪nuu,CRA_yK5T<ꢭhT|mE)^ 04RQ#`_ci1Psp#;i-+BلD[=y@J%*4v،`g|oNcB,mqpJف6x~~Gne b4>^tx~ (\F3.pB|#* :^-ON$(OцNJrcP;3*Uo n\VUe5OWР.TAZ{;luٝq1!PӘQ ~8? 
Cⶒg<2 l9}:kR !jy8x)# ; #wBjjSrcԗ>^鐺?oYF-_ZKw;[뿻|ٓ' LkW: ͈E^)>'{OQqҀgvn,5y#p12"1&h TJTUNg .O/J]r,V7*:F] #CFh}1`(cyqFUN[M,sZ9f(|;UuZ_9ɲMۋQQQQ>n]-N˰9weQ`a:ÆU dy-k p^T2vJJ(0V*xiya#˻ݕ !hyp |%A=.6W&7+O  2.{P md[nn.l#Ʃ{Z[dX"i} *  {8޵*XeNxX,dE:mJpculLFKl]=)A:yMp5f9bgIH62^ɔ -iEbºnp(dr{u5HBBxT3I@D5)duMtRSRYCb 70'JF+P`s85|,| ȫʹX4,ˍ-{w/7ZiTCxCAȖ/bBʖ:YҁnՑMP@,oݞ)YY6U:l2x4yM=I~IH/I}&rp,a~쟞sC;=Y*~UP'9.D#7{j'V DIPŠc1˝KfޏcE׷p/FѦ7bx͏EPF6i;OWc+ϭ7aibCҧ{O 'φ_seI/\~;ë?rc`/uo/iGgq}z<6[|3/~>ͳx/ͯ/l~_ʎfz7?|}S^+2 Sovb8o{1/R`<@?>YñzioO26? 'v!}2,5 3P8qa/\| 7k@]k]~>ϧgqo_hs;{s}'vz>ޫ3xj{ /[{pqeg<^8jŏ?-Dte ._?{Ƒ lQV}c= FEJc俟!)fkqoñO2pYί/UϿ>{bߕrgI]r~ͻ܈rP>ǷoΠc/b~~~C³{WO^_{wUjz^7=$e?{S$DP75Wr]nTlJSjtR}7 P !^#]W7W^?~~![wgPV%_ :]>:&\{Ξ{__eӍ1 _:W7/-W㌼w}<oЕwOx;o]?^.ь|9˫?&8XKd/./M~׳w4ّw/~}C/4daA'D2J 81g t[]V؇HZx]؟X?Iޖ?8jW+8'y+*| ltAy/ o(b՚q Rr=uy8^S^Za$sȲ_UaT9 JRlC4kڠ{C*?9,W W/߽9~>&0Zщ6^ 98rɞk3w%9>8f"ʼ R9~6௙pLȾ?fg)j2nT"a:VʓDk):,K7Gێ.IOK4G '+;x#XDzXd]] PA+ VfNRzѻ&x^ɕʝa%?ъH2:OȣEFHD@FMcOHN'&.>5N29\$sdA\.M#x\RjO B_y&9Ghh's2X38$1NI6Z@@39I8"Uz^9Vů7b3{Ȥ]UH .&w%x"*fR("2q#Ab:q8-GOޥAĘ5<7Psa6d@S9ӿ7d@6d=ӄ k܅8&*rLT(ob2&zIy*Xt0FY-BC+9X?$Z~Fhh ب,"% $}9^Dq3 tyO6ch?o>^OtTy=,B5cK\̼7 ;w,8.1ѧy,\_N.@ˁ̒ N1c@br1.S ,14U5E#=2RfS!1 !vK/ !q1K'JλPt@.tkxWƄn(9ivH/(o0{s {J_ kZӰe+ڧŹljI@py-CG1M&QHI@X!b摱>/Rg=CJ.шSʌƑDo KJCфzZA .qg(*<^+y"\Ғ73>Oxk ")0ڡ:㧐*Єc3$ brY#q8&:גY 3J#n:QJd~T'Cq]fĖ4GpxuI{ך3FeMMJ-p&-"KhB R qUGEu"HT>hñ?aY~ ETF>[chMq<y]ttG%*zph%!O/% \5:=$Ⱦ47ApSYhb'/D ;Ms^ͣ}W4jYAk=Arp ݽWQ|^Vz/*aF g߼=aUf5I՗q=Qb-sV%.&TK!IڋAA~tWZFkKw=k}t-qb3=Q9Z${?w3ζu 2ܙ./q{WBl$!KB1 =NQKL3n+ﴐ:Y~ +!t$v˛Tw%IΗqL5\$щ}2(0^-:޾IPe2R4QZn҄HHJZ`48B,qKW W8? 
i9 򺘋ʪV1]1ނZYT uzn}5ľk?C3Ψx3,}[չw\a6P(҈ĝK-(uI[,*w5\F!3gN<d07lQ7 yl07c6' {A$ Bqmo 9wT,X^gCEP10>6_FD^ ;| W7 , frv|gEl0U+T YiG8-5Ui{2"o0=+ kZ_9@es{8غ>)B8dK+Cdn\6r?LfQȊ pګPk؊d|>oo>6&|_[_o,<14ʩ$ @06JSO!rˤ$k):R O?巜RCU|j(sTnWeji ݭL9 T 3(Ic(aij +Ѝ'mJSl@6)MzS h\P5E5ޖR%c#M$dND8ԒknS2P#8udJ1)V NmJh*lR=|0FUR"RM "wN3 s *x' 8srCi'}ڞg^4ۓd=lO6ۓ۞' `֙Ӆ{'P !(+](%NM*X7'++h{6-Gjb I7w^>ơ<<$F=FnnjX)0RDP.2`+s IDc 4cBٮbW 4'DuƘz?{.L9 ]lF>c=Shhu6SE|ǢQ}HS.x ^@HƜ@(V/ec]㙻X c`V~9Lr/ڸZKMT)NG*AWelW#s8)EbwuH>b|XL`>l&Ų$Kb^Yro_b7`K[êsb!U +bv Kj'3NY>ffzM oͽx U./oQ_?VkۧߧwSG 3S }wo~BewO&\}D?zw #rO>N-ilٺ`P-(Qdj<ͨ''C5 ۑKB‰w~^w[Ii1c|(u:Vz(' z/A>!V bX"l z?\dLTlB =cz/9KQᗻ=̘$6B yR %qWj`fr]]t[Nd XJ{??HfttW7ɩ pכˋP%a=u}:ScV7Q$ dh ZmB@g& Aɴ7e{穐[l@{Y6nn7,H:޽4($mvUbUE\I;JEZbAښ2BVhQU|mUD0~/cf佨5r *ttU P?\%@d']B ]Iהb?JWH-+`- .@r﶑TJMń7 ^ts}Lw Ϲ 4[wo^1_\V]FҦK4s$5{\ROKs*Q .@ z ^?]=$Fky;Wz:8L¸}%JsX3Q I4z `;79hK1&Yn vJkJ}Pj 9!I)D!%:bi׭.m;{;',߽f"[݃Ԓr5TcE|!PCRK^,"\+ޥeO.A!Ŷ*\1/ QcfXjG Ʌvvkb"[ϝ`R; KQ!$ccQ[2gbvY!p ЁbU6g >xV)P8)8{Ea96wO1L [0Ø41vD$Oi%S 9ksC]mKR`Y+QtY$}\33@~S7,xS $X].ϱm N]_(#Yzab9DWb *5E"xJm!#NT;e .lhfy{UbY׿>m %N޵wޙ7)U\2&cCP^Ɲh}hUM|.Ā|8]Jx]3R  Bj|ݩrQpF{{ew\L5PS!.垻\"7Jqsb%%ܸꕍ@{g_! =)܃~#% N8{QP EyoSra,hнy_jX}b5Ve3 PP0C>jgckɁ1R&kzO8#o-v8{QtmSk1^g҂O5 WZK1;r< J#,ZR*-*Zse+ vͶv6~gۊGP)ms#mДmAAk$bPzhP Ncm ((amsPI Ux0T 2Bvf(wFOBCy$ @Oߟ|;4Yd>*!V{F -6xnfZ?JB,pm]Gџ^eXSd[nh3#sp١1!5<` E"#h$ "-[n!y0ux2!J6T 6<KDd.KV<hS!XZ 28ra\J{=%$j;e[DI.0 Z¤n# Ij İIM ?ɷCc|ql?Hr9ff1D®v VB$ )DR`XA,<5dԇ ~;\;4&@@\H=UUs>Hs,R9IXdAkj@D WlZ4 Ip0_.&{'Cc65$gh5m*?"aVk>Bp"ߐdP[iAw H)O bJ hd@>vhLի%, Q=ЬJ#!ڰ¦& B_8!az ӔI`@Mh0 ƭ] TH~И}X{Af$ sKi17o|1B2~Uc%MGbm*;1иCrUɪXM%\Cɾ$l{ȧZNEǘ 封6m$zHc)]M׻k!ݨL\_kEDnRKw:@Ufmzc+u$ϴ t{IZ!Ard/-u۲򷯛[9MP*`Cʁ1d,^DZU:cJ o .=L2m_<3Se=zuP@8, Y[KGg@(yOQ䐽+@ɏ95NN0gm G;{jR4-g),3UMMZ<ȶ-('F5Xa,L1}/[]N0ܑ^os-`SV N1ͱxȴM[V⋖-8.jG* mR&72mY? 
k*^v}TÍT՝3p;mTwUWQuyL?ŭ+q+YfTO)J¡9hʂR){"\)u7}@m;D5ft@l_w*G|Z`8k(b~G1ܱdŦ9BF΀^A_:,N8춇gs8H Ƥ7n;5փ[Z;Pg OEpUVhPK+]j&-*;f- !j.ϝƠ=<7椑|IJž8H Ft$rh2tb@bbƏh3%et#J8H9:}"ݎgys4ҭFS[ƏgkR*C {]O"3/ݮ}D9 ;Cq" %kz-[^y-mbF-^c/q[d4[(!3~.E:DЗSVZ-D,jNE|p(Ck2 ZBHI@h2=&8&+^8nt!J [ [PoЖigO j5qS^ĬL@"<`/WI.W . ˱ƉC;NBwlбck dž{l|>~TؖB.yee Ԕgd<~O@G$ˡOe?ZpfS<}v⁕B~vP&BZ!P0^qQdFd69'rqU 1 'u4L)62a_ʔ64i]{*Ue/xzGbRu{ɞvŘ0z60_" / *3𜫈\` )8cC zö ]$tХ:k|dV4-8TK]Ft=Ptǖ!)h⁡f gPfNc2 `~![RKKߓY;4Ïţ*ZoQoMz_ @R4-p_KCRTW \Qc тXc**q?1cL}$.~JGiR5fB^|C+x"jZsUSYPZhXu)УҒ($R47Dh\%d ,CfM *@ȶ:+8%\KѐbTYX?$TE+U)TK)=q ᪯+%ЬZaoTAjv@]?j` gdVxP|Rh QHj"$lkxTعDV=a+4fllV,v[}8{q\n/0 Lf=i]q}Bt"{Q}%z)Ovj'hDžV'Bρhhq#$ ğ=o*p5 #k0i}k*V=}X,K}_7_?>-7w߽Qu| *3r5IOsTQ 0+pIݏpߔf},Ȁ+4WTbEDZ(̡ψtp#[ȻK}fHs!3锺݀F Ph&򔞃!yDڣ^K+ϳEY 2XuĶ5`;ѰCW/wu`D`Ti2 kF9\jv2aU kE DP) Xt['0%{Chp`#]= 99;@aHLLs1D&z{4VbdGB55Btpy}U|ݣ M4T4Ɠ oo!a52VFJMrN>1ҜV=}v6\x0G(oC.2XiuUkH\LX#IPiF ܻ n V(Ay儕A-cvHN ɵFށYP*s)sZte*+-ppޗwduεg"敪bJ.rF!xᓫN= !4߄`Cꘗ+ $5-,R%bڎmޒ#2 ^ tcnP5FCHk5̐՞/@S:) EI2UaB0EPOk|&]6 RVm-08FI.f"h-uwZ7+^Se0OYhIQ$ʏ Xc;R{tٯLH0 fMXiy)`r$Jb$:YQx"N]:a6W Xu󔁣XL4.ث1[] ( pPW|7& ?p"s%܇O>nЉ`I<|..ymQ_ fTĎgT7j.A=/APR[lEs=6LaVBM?)4/sݩL?f)7Ǜ F+~ I5M3qcݼ )A-V#ln1Nih쎥[8z 058mԥgG2)NX$&6zi(I ᒏEM-IN#釙2]سPJ+quWo^QWJBRD d,)7Vq]l~MB J26+*J$ZXL0ѭkBHfV@Sl3U4i <ݎ!ݬYvH+ɾ#)L:}Ohف[Ls HoUC"\J|t AC9/rՕ3L;Ysa]`JYws_XU芹 o}܎Uؒ- ',j"֛b]:$@>y(r}zSFwrazUy1 #Ɂ"NZңn(/95҇LXݐB 'GNdG؝|j1D.dmP o:l4)o:ds:7ܓ(jW`7SgϞ=RY;{>..hi bn`Ut_eXSwW=A}h10L0"u C%+ON/Fmm:W#1PĘ<݉ax17i:&yo8県qtEb>P$gH2O[fXTJlff lKuy߇6&ư~t}RZipHCC/UXOsz? 
H?TElS|V) -id1:@M?ʁֵ+Yj,S{{7厎pTl1jzóMJ ,"7>le@|G vZ 0@}V߉fm-BD U-ޤ+Pb(<>c% JYUT nV !;nmj?tMDkNI*Vჴ`cRe-g Bus.vw JH;B U3I^i%rs:*2'" QyZbӒ*S;X+7Hr+ bdr9swx@x<)JUqB3M99#v{$j'dL1DLYw7a# _Fפfp 2^^0T<6}Hpaxt҅ˀ`3Ū_d |ѱJ`CS5iM*)Sè1>|9Yf0㦕RPK]}ܤd e46>FהTĠf\I5e[!"|c{g޺%)@ 8~n7Z vS?MA4 ̡+5E)7Of68GsMeH>Fa5} = O(*>>(h}44]#!| Wu ޵u$"e_rTuU$LW[n!K2jJ(J(C;6,&t]]Uwu{_o7yOT)W| d)Lw"T!}%D:DQܵ-D5KPPf%e) dl0珄Yk;Eup*a:CyuU86`+|]} \>դƑMeK|5(Q(lMfӱǀ qu613?w~/o\y̤%]nӨn m;Jw5iHUshqQBy }Ji|1@jlF&_3fDW[s \e^M>cbS^43EbS^;s&KrT|ʣ)ӞVK&_ Vaye6T r!&݉IIQ=eB4Ab+'.ۈ#.gB#8Y(;H⤍s04<} "-6z2p:l sXmRl^pLnHdLS7J_;fX p-T} 5\_1$}OEjd4o/.V!ӗmQi-&B(K`1)r:} w? pv›W~>W@;t5ׂL2dWcDʙs2(Ŷ`KRՒSNDo)=aoO|Rk.tZ6Ab- Ð WT#>}hQ]^RS*V-QkTURxsi&M(iʊ&o12ksu{m8VIInʁVbLѱE}[}yYQڿro=x4Q9k\13Y~A%J2;=ݒti/'kp{,>DRPqlv˩DTQ<^nV`O>A`7Ǻ_*ښR3Ov;reZPv6y5:O{Nc:G]|I=9Ȩ/2`a1gGg%3}nӶ~bFSu-n?Ce=>ƒg>yaD%U[7;ޑOy\fIyl.-- :'[!kiY}ѭk8UgnBpbb0ZokjQPPo>lи@A~.Uqڪ؂1S)b2{?@ۮDo}뽮z^* OGRq_1Geß/jlan,z[ Z[䷚[w< S8NGHc2LYv{BXN3ثHe"φ~Oo5\]8>+j1?4X* }j~:^v}Gt:ؤg|Oa0YM<Qy MM)-L1xۦVb*7OD7ia+OZ]Pl-d %RRR"+~>;Q$O"iP< GA ^]8ܐLN-EJ:X&-L1xw7F<7.Q: /`L3<7iaSU?C)A}4zU^# #r&-L0xăkDZn00R"cÃߤ)M<8 ԻN@oW"\Cl?j@We,Im7ia4k]W(EB3:ym'|*:[*^vjDS"0~h C~$瀥$"2n L}ψ9W"5&ƭPjX`AK&Vb(w<׹@WQHN|M=JlVjcѲ ꟗ4lTD {C/UJ>Kk>TRh C- #WJE25OhqB8p.*b^-'lSlY=32$LfP/:'3=²-d!fNzRhrIR%nypk$N?9y9L".{yvvaЦlLAiB`uP]α{ժ+`0442AOI!NIXEūK.HX 5%j6R4B+#+TUhV1F6*;)~骯 q6-Q 2=I'ӀSU'[ڶ˻|/#T[Ux>3s+IzdԤb ٸIA|r-ȖBU'vU\BvުTbGb& I}NzsucrAOq5K $VԬͪ2 fvYRK-uB˶-oXj`ѨSjUsspѫGl'v 뗏(`KˌV#'B)6iT6%d{IXMÆ\C!'jvWXzfHA 0 QCԄ&,O k3?U>S 5^0u"YD9Ҩ JCK+'ek";XQ`+JA5TPcPؼiCG5UBګ4vlFĬ*P7$դ QR-Ȑ#Z A_ۤN" hVjUh++<kEkqS0#jTqЀ8yPV펱INIR+uAwgJOrJ%BAs7?WP KKqT0{8^1'˾Rm2Z- 9Icl?Lv?H fuNȃ_-$?7 YlQ%>|)EvQƅ5H5}%KS NqJph^2Z~$%] ~x.RYUN49sϝ=IIgSV>bQܶo~Oz0M9joRZF*t#"))J*df)Mo @0#..v@gi5waǏR%_%_/IScߩ CTTONBzUc쿉:ID}^[t֦L$$J})ReO.ﹸb+UuI3;]%zT/-*In|#R[Pl&b]vdsWKo<ݬz9XtaUG,CP;u:IlU:AbʴrnA'dQ.V!F -eDc4*9؇? 
٩ևd>Xz4]h~܋mO`e]ʷY2߶L@e -Sn+É>Hc:FfGOzIו4cW Ԙ,?\W!N7Ocv.h݂kKu VjLRUbk.658g,- G=p&%HNH.us]I-5B`ErDT?AƆ*lqQTɖV)چv[Y6"V@ AWB$l/~5M~UV#Ds̺zC }İJJhz_iv ]W]’]ν(w*A3MUu= s65yn<~Ӿ&0VG[D˒-ܜ;P,bk.S|_!bU@3dqmk_MIN(|ؓ*}=׹g=h\AHIAĦ?]X8ddbH}Ůw m!hHuC BI,E!>Xhԉ#\AvodQl?)x/~=tG'O&<MKw}螅33l&>WvwE]xIwDa5߹ g?(QbG!&ѯQܟEmEo|j~|{'Ogݮ7=l 22>̧Noֻ0NvFWѹu:;OW@e;L;E^wL*D,lkBx,zjt?:z޳(Jr?ha@rt1asIxtU438e\;H rj*7?zJ>^Fprz=XOklפֿu/c/G?+zNw 'Q+c>Fǯ~>Ce$:z}u}S~[tz0|-"Z֑iGH\s*'i$3 "h˛j䓖k[7p=575ͦ6̛=wd؉@q?/7,Nv5Wh[ZJ Tꯕ"~Ch+܎2//d[9wO<})W})EPl׷ۿ .pwrjן^$1;e/~:ɩh0> p'@>}w,Q&l6w{6GP|.N^0J]vzN0g]O/8ť~ShPx]}Χ4o oGh8yc3:]j>oEx80W2yxu.t?ŭ~wy7lq};Uz x?oP3/YeYs갓z{eŧp43|ߎf߅;7xι=H_T0VQN̅ɭO{}bL(v_\"*<*V"̮>ՌŊap3/u~P9nvD{~~/pv6%}N>?z:CUeWٍL,jyc-ש\~ABOgwcE",- َ:9eم7[9MVٺi f\7o,XvoZG1+j g1U`^XO|0B7Wꀪ9m\U=IH8>5nH9 H7ײP=8%o9?ݰ* Y61SZ 2ANRŒ}{iəB,8?KZDy)$DH`DG4Y=Gtb/ 3֊xE ETЄ ֚ 0fZ[)|!)D}~SGAэ쳲v sZ lIHL4 ' U7"V侐Or8 cNIoAH&@ Y-;dFFPPx/wqh-t"9R$&ɔi8|0 `(p0F; އOuXQhq=MF/[2ua!kVȸ(c4|V>J|aD/}asO6@-'T ^8(+/"r되!iGw5$+"W.PS S\5)w}Q':̎<JNVrdIkfW=E|SlX ,SUtZ2K5@$!Mnsq\-p߃]sZX{]5`栖<#&%nn!;尠[(a;WHS^st~╽ o['pb Uċz\JI՚m4le}3\EGNкʯ*h]y]V?^ wt%_|ӘέxX ñ]bhovL~|NR^ Wk^ӌH 9g?j$h$Wl_IX2c-h%t(M\wu$5e5OGY؏ yT,AG~X xrmYЪ,BZVӲJىDNxd[Q;.4m wLqT"wV]@Ґ vK*鄩4hy$8Ǵ #>(Z%WYJ!KbėF끹.SE/凲]~f,|Ei#+]H0`Fpٓ Ͼ On0F$\1/Z;5LoٰyVLƓgI2켆 Jbuﷺ㬮`IK V#.CyhZԛ^6#HBeMbM޼S!B_Q/U9qb6+$jH|hn {R jޜ~dʦ~ѺJr+ }+Fط~J78)fX%$ k'CyA z=.Н&+MJg cT8#˄`;&5N\$Ij )hhov4QiHMgo 4 a@q^N[6.aXimLyM ,׌3E (_VtѺơBv-jD7&Ѫ@(=J4 T@4\`3r 1M$y<'96LPQ0OfK0 b'I[ԠRH"/.I%/7h/ g8yWF`F/nȞGG~'( yd]؇a7F>ma>K ~%M,Q `ƕ=L<%1p z`ǐgiI$iPް р614E˰>m)M9YwddF,Ԡ2X'dOF 玫=v {Bd[z?7d+Xظ߭>o6MsaǀT\ =h\I;.0HhJsC*NLJ] SI+5q >Ҁ=`5vDXa4/t!*! 
܇Q˗*MP:r)w8j5!Vkb)VkbYc*>Xxw(׿Q2:;G?8|yitd{ui{G7fۦgڻ].mwm+{\3^2]&;Oi[auuHo/?fYY6Ut_4k}G#w)yn換;/.0nXN@뱹 Ha<韤g״Vw} Hx={})pV8lڟY L`ژY(M.G~f 4G[v6wwȾieͷC[%~ ZK $ ӐO\›{;7Yq꧴Uw#;/^`/7dʅzg ==Ej7ʙ.3sΛݗ;ُdϾtuvyKwF7vj }wyU~][vT9B³ЪYDCs~wA˂_{J缰C}6Pc_ JΊazcF"k+o-gםUn0KN,'dSe^fe."MDž\dN#!c%\bևSWVVa%Ӓ~1rF0O/C%-H⇵qwk?;|l7;kokHGizpumc:uzA{6ʿ^¬n,^/߅ql evV9w>;(Ru預vr/¾MfΩ 7=,;=ˊkP쟥}-UaßޅOf'@ ޽k4{5 T$t?jI|xp)_GNk=x9s1w-`?m]l:VL/6fk}[q;_@HtaJϵnMb}O rϧ'o9T@$GƏ1 *?Ih53ǡ{ߥ}%kjmCLsߴR *7/b=r˜"juinè7~}7`4o;f+9/1kB-G?H?i8|'=x 5u/pRj3EgvhgGR|v݀HͳI3A&c#˟.?2 Li ~蚯7MV9ity-_X25n:٤lt kБϋ^+R bi[$F3/9 ;[q.b*?ld[E% u<;YFMD]Jn!x1&HK~ +;+O֬l>*V6-=*ޖ.fgkvfg3;;^Gi=2[z[ýw&*[d|NV35s;^a&5ܫ,*Y?8fqkfq>Æ &7. r=u4"DxR9ܛTas%5[5LƒA5[5YrY^mYüwvzO]Tkfkogk'ښNG^㚡} wή۹T5;[m͙fbfiSwӳd]8+j0QOENqz{'{bDχ\Gϳ+`OBcw`OY|:Y^v vCo)i~`6e ,o]# jxt~!(!>*dqYm c?hdR\s HNt&ANҜTQkc׳7 Ĩ67.KoZADcTZd?ne'/)a1 Sy#1+u.2a=Hua0{tFi74)tt9A\+>`^h8b䶟L#7l,Pt<-c hӝVyZ ZV鋰/pP+.\F"6Jf\ ZXɵv9S8OK%5OUO!s :BK)Tȥ*aǹKx"[h *P3p^6Ott ~ #_4J}.tI:VJrv3xx+xLQ"8p:wD2MdtLDE"DHP& a^kp'|](jJxNc}"N"1Vpp$$ALIJ, a ( w@!xu!0C*ru;#ֹe꠹^QGse~R#E90$D+̜Em D)՘r.1:pD|D~Bf 6gA")HG$yEa!^;#_V3넼fA}#*(.TH yHh+-4Xah)-y0C?0kT]y5iU~׬Zpu6ےTHysL$}rL N,ȹ;& Ud&X&03gt8 mef1l 0«;)qG% /gsΰˌfqh‡cEGEr2RAV8grr"! Y0% C #3P,W`H8GpH)`|ʩ)Rr@E ,pHH򹕡YE T9R"bs-j˃@?k 1SSJESoA4:X8a")rYJ=ցh)4Deab `$u10S̝$ɸ4448 DUF,a 4gʣK*CS'#E/MhD  Z@F3aϰw KEG:Q VN2xA`@ |t[ "n,(s6zC8փ(FmD4]$q $(G_@3+@/-bk)Y  N ki `rXzJf0\r(h0{摢.(t=` Y`$g,ru0Td(A<$R@=6 bypb!:5( 26xsC@7cr5LᴽF Ҍfalh㨨el9wVbDP(/9xL 5O9)"R&0Hk=9`IPC`XЃDcڕ0u`ԠcDB1cA>J:0D(#^k&P9 94MU:W&WIyP^Vل ЪVC<-*ТIU9Ok%V kqFEh  ~10/Nem֨ےz2N}ѝyN73з#b,~%_ZQsZuVu]b X-M^Za{%Z˃2UUPԡE^bC.+_Q{vKT2efn-%NC$ET{FubCD8LRML`SMi\6e? 
: taS6lwױV Lval'>86u EU$DhʀKB+Iu`K F A#g(#L`Vav9`N<1in{;AK p.('gA1Fd"ΘE9Т% `P2lX#a #FM| d)C:Zt)(ZY ;ϼGB` N"'M:wi "j(Aa,a2fcPPɹSFq/4X0p.< LU!lGQF?fS̠I@CJ FKb%PTL#aɸNxK0ɉ̹0!@GQQ]f[̨.hwK=HCr[`IQ;N}P`=%`SHu*,K =Ua饯Ua˪ґM?F.+iޯ"UaWPluUXTaUa/YE4ҋ";YU{/DqQ/RꫪХPK˯Rf5֢*,fgТOQ[E Ue{vVaAݯ²?T*,=i*,/UX"K/UXTay N*,f8hP[%# >g\pȱ&,:߸IbTel8A\QIrw/(5Sgbi10 uT*@F7X1qDH \}^xE`+BFi&{38^?2q'1& K.f N+!ԹϔΕJ\!߅h,\1(7,IJ"uAZT }V -gyr.SWwv["c62je@/ʾ+keTf7Joi% %v^7zӠ jCՒmZ]ZeW%hoa|&ǹ99b-[YWML"5ݝ7 %f%!f@"MbO!?>C`oe{Z5NҏӇ|C 9[$ u9E}崳K}TkeE>A?}X\CniЙ5s=۟sXbsU9C콆uG6.EXr}Sce/>Ua#k "W2}S/~|Glm#zatHz;j K޷Ȣ n *Gt}0ǏσO{_Zj-ԛy۳B,Y|EڔfW(4E[zԗ7$tǥd Nz-Wٖ-ucI%x޽]zהvV  #F,hzY1ŠFiR)K?B 5ZpƣG O QD줰g/hxJg 0nA\gFgcZxg!L1AN%2g( Zh"1SD K8!s#u9N _B ቖ9¥1Hc. )8QD67IVmWBăJaF0όŎ{Ƅ/W1H%Q36!%qMNHJIKΩm$< F`Yu''Bn\`W|zp7J} sLY&,%'d`Y]BuVRNL^Yuv%n\֮S$&(R!W&pNbQ3ZBmh'4M p':4LL'e-bѦgQ cN GI@DžNJW 2p"* c QGb[V2hgL#%Qj=JZhR3M[ I!J,_IRyFZ9s `% q0?) .ʤ2I)KZh"G>%*SO2S) AĊ!g/hڈJPa0h)i Q>%Qʁy3&Nz.eH' ĂI2xz ჷ3[L Is3ŗ+XrA (D L1X34 w6#z>OwJ^ :Jpy`TLg/g,뇧qn*;_H"2XH`DWMִ'=0Hu} Cq$Bd~a !8 SP5YEk%\~-݊hc5cY)2YxJf!?61廯G>ߤ}%1J9<-ro HG|y\tcЭmtqԔuY84؃@FqjS-b5?G2_Fwk%FذDz>ƚDC: "!5:r94J>h.1Ag[oK㷞N;z/śTb&{k*6jG 9jy*}ٗ. lwY9ʱeI8vdq~t-AzsDT|5z&Jqo~tg`挳fE0gnpN9X62%>m[`ڏ4 4?m5 X L &l Bҍ;<%a%}b8@9uAĨG/,Uz0D;EB bmGwXzXhĭ0\! 
X{/ @ci s$X[a&#Dԡ6;ДQbdpXZAeK8ef8/ӈ T XVM[™{iw˿fwwؽOx|+M3 ̕hV&$mF"v >ijoi<_2]>MO'ſgFʻWoJɷ?E;NFގGs8{r1o5)ލRuoH蚤ԑN((N[dLA?v96EwI@(^1bqS3ICd4E˓Srյ 8^ ę6)3}.DUcu SK&aM8Wub(ύuI'ɳ) 1W*5hucJ{ͨT(Y8H Ɯ4b{\ %f+c*>)Nl"6g \)"ojшƲM vnc/C;jl-:i!q"Xr۸ }:PpplAR|#,\G ۱*ud`ׂ~îŬ̵k1{ ׂ|1B%'ў <#)VAҬAQ什,CdHT{ nm8Ӽ~"bd4W}.4ò4gSQ`qQn79'箾ao yT~Q~U>Xhzr?͇|9>?0zH#͜INz1X'pJpD(G9H壃2nCΘxM9}]_>ͦo5aaC\o/O6o\!/SuOM'lMb(eu3\\R;Cr"!#Lc#lE9^N0/+<]kl| ^usx-4=T(RP]a=h/a~6[H$!)vU$sչUH 9 Z=8d^pː*Q p!wJ孬g,I%ZYx .+" kQ'Trp2m,FN5Xe^]j,sN zz y.v,J>[6ݷ >){Q2,Lm0Aa aAs9 `˒|/t4T)QG wntu]fEM9.bT`9@gEa{Fs5ҏOn|y[FRsA=x[m˦F+Vk74"rRWEw/|rìY IJXҎHvdoi]Cx@BFI'PL'Wa=WZH @NqOVS*gPlG{,Cڎ¹DGpoMO&ZYZbŬ2I,#Lu?n4{xnfzXf7ǷWZֽd (?T.3ѯ}P9\u HyQK5u yϘDxLI3N )'H[IHƬVEUբ%bye<!DAX-c b?qCo'`QqJĂf rj( jHTZh/s U0{d=25^ ɕנЀ#@5."Q@ը )5[a&қ`b!Xr? ԮW&O sBw3S ;x#W@.[B'G:(Fu`h)8pE@XcLLěUy#wX0rc6wd'WK2a~GW|%~.za L|Gie{/EX^lf&$h\$lT$7`z="X+5`9PMb5^!,Iu8!Ϝ8}&IAAթ=tmDU36UwG)}4BӚCX?|29+V5VV*eI] xpZƸ<i3!NZ{|[asurZ Z<&Is!?zIMQ*Na)w7OOn7#wAFd۳hR#:lz> č~=g-F%sd:sdRF٤U$5q 4^vE6]PЬvB;w 5ժAțˤs6O4Fy) E.#ؕ3IT?-)1T0ٷ0LAF\f&S"` 8[ S{d4eyt&` !ZOp1rYv&YvwAx qrr`Vu#D|pg|32Fa Ve~n-n=?N pm^gq,ټ/V;Cy_0Dx^KNqØ}KE}k[`+Aޭ5k&\ڕT)F (+GC*S>uX1X\ RT'wcRJAb-AS[W,R܆jպK'Pq%+E|̝,U8DaD/aSՑ;R\)#[j%^v;0;`$um5.fK@ #%}fM{#/OP/YPIt;~+j!#>1\8 ƃc" .S..xag 6¤߈}%s O;3R5I낢$fy 9 ҰLIjs- Eg)<\~ZJQHpW45e[35T{mNoN(Ѽg*;R6 3C fxL4,pHY^g#*v.jDO|ƃ$34^PCZr[Ҁ!5ހC:扬 i Јeɣ" /t[{/NQduxCG9h@ @?M/nnෳ(볥P6Ol#.x_)2E5wPWu@Wu=h/&ps#,V(h俽>w$t:vn!C,] o&.<1aOv'ckV.Sj]eqUywbLc3^H|_|ה7r2Ӛd5)wi*~/$2ƈ!'xx L=:$iuMj<4X!ղ7c*pҽSޠ}Bگ7cBˠ0yBH:9BEVa/w˒OW;QX{هsrVᛃ.l"}AHy鱨?Ak`# f k8 ڄ?Rg]6nvWDq1@N򖙼&0E&w$hw~@;egϲ+T峟9Z 7/1dA1\+C7S)jhPSsBkf,2I&xS10J#l-<f h=j*}D@bQ "?7"2PldD tNaC’N0 !J!Wy=:υu^ikyR:$:҈ĝ5UƠ5#>Ż5zj /Q+JZ!jC͙ IMigkTB.U+wTԺvG6SH*"L| \76dRߞ\)!j&gf޽JT6p|)/E*{ m~ZQ5dF`b_1$HG$yEa0>vTה>\Z!-$#tS1kUlr4 Ͽ_!QoNYuIٻ@YRfEžb'g1ZibXzKɧi{%CJ(\XQ%aekbQd[+"j" PHFlb#y˛"C',iY˘*HkwyUiL\%H:ai$ykBKmC" 9X8PⰮ 6yE&h~WBqŖDI` FE.v|xz̡̼Vsc443 Z•Y8R*\RZ `f{ЋK׏6qD: KAq9Hz;WS}h,~b qX3SZ :ŵRZ #"P1a 
Z&Q7ڳ3f'T=]pVfY0䬺YqK墲rx__M׷32.5?|}!Rn0MqW=Uُ=~gz)91Sh59V!4\=ߧP^;3V_[2!ۀH<&&KLHi EHU%f5M|ɔis1Ue2;:& ܇}et|yٯxmùW @Vc5ZV,*nP WiW P~tJ Ӛ"t:v9*?#*<aK1winu2x8wnv{U[,n.o+1KSFc_pл-=1%P#܋Dǽ:nG:/|Ic \ @Z!+#=F{D_Ofzs0Aw=fUL-&dcrc4d~WF-cq< FVqm48sdV׏ohKrAɘp<=ndV]*4NOǞAYDc`$"ɾ>Xh[f G?Ƙ]T<}0N2-c( !k# tbhsӴ:$ci-cTnGΙԵǑG4)cF|tz#6!X(rª,I~HĪH^k_CcCEMO73RdvBwIhPd3|=X3/d5EUk[R6m~LZ'YZ߼V4[-eܒmgDŽW,לd2MLi#)s=tֵc!R.rJDBD5'Z Q5{g2kުZRۨ]RTּ4ʳ@ƾL-UۡU$2rJ>}LP'0QO JEnemS9wroX$An[4n)^J(6gY..8B\tCZхr+ T3N˽_U?SZa90{l Cy]l6*"6G/ &7a׳m%M_fq;Գɺ$^04:a6;?ݸ,R09/Kw~w׷Sb~np_g^=-LXq1c:̗ .@_Ҵo ݂ ??Ud']iJ" ۰hrR~4: |܋>תuiJT(t,PJd|Ľ9x頒`|}\GM~VOkx%· òf Sps_јɪEq}]^Ǣz`Ü1zPΙ8Q!kqD㎄n~,M{FD%%]#XjeU-clW&Ѽ_<'>B.wf,a;&b=OLvrCl6` ֛wazsXf*?_i0W7 Olpv&Xg'|샙=^gkWYt!bVb?';Y%lK;yT-|iz:T~']t0ՔX+;8!2r BOz#i٠|K2 ֠P><Ұ (MϠ|yRv9;hNQAW)J1(05EZ !?c[\ڨBCp=!WQ2ްaQYmPLd^ɀ5EAmm;Vwp4$ޝgb(2Ʃڕ9JQ-s Z#z0kZHCS xõOEBLFUUwqìR!A'gA&;Wvx9**NokS'_e]N)B#|| G޼\o3P,F @u뉿]R/Zi^8sU}A7 ˊq?.K嗟^r.qW_/>?k4/1j:|, sys~Fhx&dzI}1/~Y_~Na:?Oq=isǕ/k5T+J#9&ˎ\>IX @L9 0Aэ*K(`w_u{yyMFLWfA̓+S€3-saf3(>$ѣF#iimJ:v}Bz(Jϫ@eCR(R쀣C1Bf pEwuRRӺNGJi3OG'9~P=@*?8}DjbAkL&^aИ,G%iNUWt`_L o5I "LС}A`ֈ`n,Eg ,W4)ͺn 7y6x%7B}$T\s{ӗ]N2a欜tv<`=~|) ur7\,763ciaD/I:p}+cVKc?. zB W]r}rOa10br&QUa-VCO/r-{uIΰlTSU11k8F5JkNchM#tQFҍưFD}cfQjGo dc ՙ,=gE=GHB0U)Kb%cX{<{՚͏yޥo01K-`eAV:&)}@3&sM[a~WbG0X vqOWfkqr+ ]o.t/QɏU!WTlwQ>F- lSU V0dA3kqO5e+Vrq :>5}[r. ݺiAd5-I[L3sH]j oxu'7Yr8W쁖H0GWeM|;Opshq90qaMΩ>MHbC#,RNW{^ Hrx0Ykv6yQ5=c V2S(sV[L&VD` 'q KFWeMf1ת$Ms3vz[J[ğ qI@Im¹CR{<i$O+{7r\*aTim CL{w;rB G<9#+))Q>pbƉy%\pr# Rc.S"F?)BEP&؞|(&W>Rl/5V""&! (GZJ01J+w!% `Qgű;7,>OM ^b?4M}.O I%=R$m +N QM?rW(?+]փU$M7 ]F{|Be0%X K({O83bI7g:`\hN nůA4gѓHÄ &?Φcrj|y8Ç _grxs<2R*xnc9Lݰunлa7Qr! OΪy._v}1㚧:Wtr}& T$Eid]Z|q(jHr~"}5Fʴ8>6H=z9Oq75sN0@XJHeg4p-]^?&'8r_*`cFZ9FB`m8ye0@?9zjOf-QWVwb(ЁD.xX0ߖwuIVwvjb 7lLtXiҥ->^w`>Iތ~1`=9ε`hͲWNw޷y%: 48Q\0~_~&951Y]~tbq!~$jo'-;"-{$'bLL nKͿ $_'=EmQ_\;+?|,nIۗdw~m^a7] WaqR%Ӳu>Pg^2೘xk1|"%$REaU.ASފ&0[MOb"cQ^q|?x{[ÍC"iR .M]ZT[Mvy1]_ÿ0,K"k7Oێ`[jns3|PƷ[@z! 
>#iKD44wC kkB-g;_6ϺH6rϣiƷ_oQ׍h~X"{j_NK+f q҈L1"*BhZ$H9 'FQ+58υǞ(=HzLR+܎9$ sT;"a-n.#ۋaZw_ߑHb& G(,Ɵכq OV`ed6*(#b<1%IQA|$â0%,O{W;IU0`q+s|Z#DcWÕ=^ss&CSŋj.fCU}$z7)Y5@ܤW! UZ뙙_&{1M,8hڳw 7 m_1-_!w:^bqNO\ygY;'CyTc{sH;!/HA:MVr%C`Cހ3&@)qh(%X0vYEp0\*$=@A=sud8MuVcuV'pEܦ2'6Y-RHqr:0 B!X0Z8Bz% D&@!VΞ<ғ~Ϭcf%fjX#9Y`JjewhdS1K|sDĬs sP[V U6*RͱR%~mǘr<0=ƨ2m9u&#qڃ5ĂUF!(`{~bIMPV4{LxPOZ̀A2TXBW@Ep@FKJqSpwzD'i=`7[9AO ~Y`l@''OA`ֻ)Hhz&\/DzXE~UeB!*fψa>+޳֛$XGauR!]%W*sAM,79 1JMG1(h (pѨ @HOL<8X1' 7\Jm^#1ٴS hӎ= b*v];R{MfC9Cs Vvԁ~\uYP \,PvtG,xbޱX#r DӠ[^՝5NZ(QrВ<\'zάaU@+1k ֿ]nQ$FȆZ{dӀM$=hkigb ϯܔWtAlo,-sba{-k6r797;5tÿ}@ok&hEZstjPK:4_ #hrA~s:$b iwKt,nMsA;MVn3Gp綬c-ni& }07䍜$:-ZZA +(}jiQpTĆ 'y !m!4x,maSY %fRMpq$$Q]ē4=hz^먌FT^İ bs&Nka6!@LRVoך~ kFW7ǧQbX25lYSGo;ɗ!c QEi 3iI =#U3S=|¦V7on1E{XYHј`4:LD8RbBZRoilec)B509X  K, !^g|Rs$[Gw/fPٵ]aލVJV . ҆Ϲ, u R@gӏsQi $@J 1 S$&j\{$G20I04 h FJMY-*|JBYP1A\BW_SvzkC"\Y+tUgWtm(qjKelK92? y<;)jDQ`{Ulpp H:/(Z1Ʋ_&MW7+>+59A[DϠ;4/݉#g©b*a" (lhտy,"CD]~y>GVgCH[ .OiLuF嵥͜a"g;8;VY'~:sDm}Hlajq[((i.4uL0dR:MPQ_yKvѿgxł s.i8r]/%xnFxٽo5#ZGxw{}ȊN~yi kz$Q3á_[;d4"h:ț>lc}{/;rD? poE?93@QM|X$14]8(:ã ;Y4뽃@Z) ˽C=Q-e .#s-B*j EMPcі!Ąi:cBDiQhj`5O=-]G(1|xXn!I\6 O)R1p+E1 b2]YLR{Cǔ>QEQl#Eڊa")v*$|{nBwO?zNop^cv:k΂$Y+XS,wcNN~e%1V|4%q/'K/N V+)3H*-$?YAI3DK2=uyp[7bDzv)q`sERZ/̰cpی(eq[ጳ577]MW(d.Wx﹃pv¨km;f,-9q؍b/)nIԑȇ͐psT+X7h|^[,QڐS~>@9m\V U_nh,;\*+s!H1:1ĕTS bΣS~96mLl1Vg0Q{0Ck|zBtu#Lxưg^ID/iߤrs^$O oiU}>7w7,-#M 68rfxM}O˶$},JNᴵh&ɞG3MZɪ_c%=!׺=/ɄrZ4!F:$:=T e@-_-FNWS_qn7莕v_HJhYK'^Oyj mZc"ljsйztl@ħ8ursjRԢX6vAg{b8}VG.fg'.r3a"%R a\㉋Һ>]ۊ)9.xXK\V.e|n q*l-tEwQ=řE3t2=U88kggmw\oOxl۟w6~^=7}0 K5KZb_W=8jt~6x BőSx.) 
9wR1CE"{螼8J I\bNӿ}{>ž6.t>G]At˷az:#ƨ=,*_~s=.#^~Ʒ 8Jc5*c s/$%HZ`a$\?ga'[;a8({~q=:x`F{hi KZl`g PU1X&5 ~è8 W " %ySllu$ju͉2!AR&(ԔX)!"Yz&kzeey}sK*IEQJ{RQJ)vHpy>yѡuaWq=BA8 ^Q'IP\"Q kJqτB9dhȓYXqO:y q^Za0,"w2TkvQ E:[1B@6Bgc Őպ\ϱvEo#T"e10пҝ0mSYg^#Xc(sV[LM}X[R@l'_S<^𜼐E;eZ[p4qWgt?~(/D)k8pkg 8$09Â#m|E86[0"< O!ЄX=Ӽ3F"k8i ~ɻ7}7XIy›wX@̋' A%dz/UDP^|,^ܒ'ScB b, au0ij%N/> 9/9uY1EYgEDMB)}@|)OȱxEnK7VZ`'qFWzYX=y0h#L㨒#H ,993^"qq@*HaWYp%Wx9gPx0 TYM* r+up!PxlRx 0&XTܠX1B\\(ʩ4(rJOŁ_Y[DE?hdAa~" BKo-ca8:ͫJcغ" 3>dP8t[ ^YhŃŁm4qgϘ;:eV,t4fO:T3l3Ce{q &8k|ɰtU[QsJ/t)aurp[7"VR,I,1C %/c 3Ai!rg)%Ȟx93V` Ɖʂ,PcMw~v #}ߋp9EhlDYahɬ :RN!J.>)Q$151{4񢜣aoJ?Jb j(() ;ʨ=1i9u:c˝aN8eEZMQ[0uj=5/KQ}I7AH82E9ws<DF Q ɂbaaQ0)R LX"Ϣub"xQĈHe5!E@yq뵱,s.T9-K0?%1?/"V.f̝}X -S4҅TIZu*UXIŌVixbi- Ԙ}iM-Ғ)@Ri[6K׺ɼ7u9+UyqOHFD[֐O˖J|ݥ73- f~ƭBDj&#(8(iJOED+%ZsT:bv(ԃn;#LFA WDz.8]]e7G_oSg\A7fæ1CHbְg*#j7s]BOx@H0u[P@ilw8'7G*RcHHNqUPv,yQ(Ȕs%#8a|a/8=> NPׯ hz4yz;-uIUHLR)7 /85gu,A{^ Ѣf2g<)C,iYc|əwާy \a5ʻ{:]Ftgv3mTbtR`wt gb3:T5_.ԣG(|ކݲy l&˧O{`!҈`q۷_Uh0tw7w-3t>ڙ~FS!ߤ_GMt۽6X 5n%K0(9 joα4 +)R R|׭՚k~}B:`CQPC+Utzj*Gj[] TWle HG՗zzKlqú nt3nָMz:yͅį7"A # [{j)yAsVb1*U.005R%uutm <]nhK? 
% eI\ ,V/ù'e@ݺ;g^w"{̺w# ލI"z@yϋwxk_yR`z 2 "ϡ3˵F9][S!XkbZ12;;i1KǜNMb51]ZwlD6yclH`㽦H:Wa7~*UШ 0vq$FC>a"AqB k*Ķ'JkBͨWl~˙Y'7;p!*Ɇ۸*NL6w]nmEG#ZTBXD0ZyH sD5fF Ҍ,c c{ 0&CS 8c`m2H"7q 6"e:e);( 6h|A` {QR6Z"N#1Kc;-KUQmAԹuBE5DXIk2=M1pWp2HP8ʨax1ۖ_Cι#Z㍿c|.OM˚Ɗp~G1>6N+φ5xTsZP=4#೒B xd4z)5 1)Rb%Sfꠥ yK*}jguǮR{o]k$H>;w3'Z |'Eڝzf+}U <\ܥ/9O7Ι}`z̗odfk1\Qx3l?{a3: 4p| .xy91SD"Ff{ͭD0¸u)cL?.f AnhU0\7۳ܲƔQެP;`F]a=yg) gxT$Yt_92O>ɀLo;X͏t8/2>֓OtɋO?¸nYc(z˯_Nb+n\3yVeSPB{LJ  )Ϝ,0AX--VSO{y#qlΝAk7g^ͦh67= 1fFU PAUoTVl _cgP3p2vqA[`|;W_k"y By.+Wza} e nWyC"aƯD O)0[Zlwb@$-x'Bk`ːщXB46zDu SS;[GUٹIz|8vYz` vẁ7+V+2.}ѣ+Ƿq#(j%1lilUj=1D#,lP+Msx{+Gw1vJȑ1C@WwTK]@Q/wLt< aA,teDr&8$k} V 1q4Ifi\PvDcҭ$R5Nf xw;_W8ç/;덃n@/?e έ>KmirΒ|L4Д |ڗƜ] Gׄ܅M34n>˥YsOoѺvh/j8ɧzJ7H鼢t닃%튺Q8t ht+!pSnMrF5 jARCNG#%\x쉌KvIi wzJtKt{mٰ4}DcIN[ J.&$ӈzWɇydH._ZU Mxa ^Ϡ(K8^Ϡxrjg._e[fHzg`g4"+xłs)sV#WE)4N(x1Q Ɋ#G|ZxAjӴ10r 8Рz z= 8wĂB>NiAсEqq:jAH!q:XApVfXDr2:%\EPEF(2((%×EJ!X{uE;VBIH#]B)$bnP* 6/8lgoKPgɽq+\qeg3go QqlmFY~ƨ">ykhoީ3+䗫ּ߮sA+C ,OXf;m i'onY=cp/* |VI3f ]F=tٻFn$WIӘ| 3 IM0`lYJü+-QΠTDq`̩ibJb(sHp 9,ЄA0ǭ ]1/=7e2FJVʲܓ\:d18,0f5^V )iVYP4ކJbC`v9J ?ec\XQr_b<%L1 L1ُ0!=UPb FR1g!~ LXs>N)[!J>PJxʒb +Isa}y' NZ#~]_>"8-ZA, 4u@ raKF44 ZjJ2Rz +,'!Va9VAOL]9$-h"Z3EƯD SLρL:'{3Zs)"B5,ݱOSEiuô4F#z7-ػkޙ9_Ed`qQLƕfV\e|,ߒ< G0vb~xg>xu".' 
C4M`Tb1eAa!<+XML~/x;xjw91Nf/'/5( w]=LW'\АjZ)?Xſ~yAxicxW趸󋅹m51pasEׯׯ~9?'N/*s X ~XacFhJg?h>*))QkC #` 6&r<^j5?|f,hUBh1W(4Zy({^}Hk[B;`LOUw"[{q{Ɖzع:`li:=~蓹';42 OfIZCVjk,mX5*.jTTDž뭮sIi.i[ۡ/-4Td0|ly5z9|{ZggiUR/(vV;hL!Ls9(hK>z <9 L<8?"7#)2N䕶'Aj]`(Ldu9n8˕%N F*` T'tR!崸t)?@*z(!ֽI Q;Ni24%FayO/=6¾Ad{­'hee'N8}"7Ęrc?+tP;Wk\I|h-+hN2ށL6jcyDuȣw2([8΄Pq ir"nm$:iV/D.QJy3^BaACEA Teř6r(8fU9_jܚ,PUM-?ַ9k+jLFaJ1J3^F/8DUybOLGMacѹax|sZoҋ ?5ͮ(!s\iuEi^yy|Xs$r-v62vHkyBS^:gS5*샅WYHt9ggsZ^8l_qoP>,qXu9cU>ثr^@h)L]Ia"g-KXٗy"ۇ5cJ|Ck% S1 S39;6-ѱޟ順t\H !龜?71,)ApSf ۸UThq=x<\U:]ƣtFp2Zafv`\~0stl۞@ |si<1PkuԀ(7@KNn_W{nCV$:7c}ƛQ0Ɛe]l%>(8HLuLj"zt;dlJ?M8{(s.랧}7[,G!(gmzq&o.xaQD C6*$:n]`Ym>[ E/^zDmYi:&>t}!Z!%!=@ly"w9 1 9f rl&]XHTz1rʈt?`;`LTcYm-g>b}Ʀ!o};UM{R'3 ,`sbX(z*bѢ h*:ɮp6nRliRb{ox2)1Q2UKֿZ%%8 Tbr*Q7ҢIJ459UI.7u"bL~iC:SřF7gbzܜjtsfglHFZd>*~;A %Hbۢ![?23f 3CXO#1(I(ǕXoȽWո{نyy~/«0EQ?4rhYFtB旳bvq [FƵG}.5y.5۠KEO4tDPޙ$y>Ǡ}Ncn}%IF_r:"=>pT\Eڤv5כk[ߟ@vSe[\xSqBϺ0QCP\#ms>x/""/B4B=uG&ƌabTؽm=s8'ðVwN0%ئL~оI !qNq #0:ז% ,d93xsð|Z5&gAyi}E%| %jX \~gժ6;޵6Rr`>SW^nD\[^n$ @IJ’匚ٗ 3Or=Ӏ ڍ;4;!7  I6'mddƙdg&J.s2̗ςaNw}js^]W/A]:aG`|Kbm('Et>qR(iIaݯ".ܕy⡺ k&@wpj)ᑈOI4ѾiRݸZB7)8^:TeM `]Sby(kӤ]`$M&Di%(bE/&|b)P^"J<@_xheF?C9#N*Ԧ?1I %ON3qRpB7{sCR&ťPPy5oD]c9Q2P []1[='pٺ k,X)bQRU^UB (`*AdQ)@=8*%. 
c"= q,?,dU +4~,!&ijUUB[V*y ,D8*ea$+  wGZ._gUҕi>"s{#/ R8GFR:$H {qA(0loU^U.q!@ZEbЈh ?o.?ৠcy{z6Wrzr kHSB5wAJ"".,Ԙ&@]}GkY?W,j>+8%EbOB7W{~C0x3[~R8=L\1-CB?wzx՚!zC13 _?)3(u@nfͳL4ϫV0aq-SX<+F'a TJtUg-Jae#!"Eo݈媪*sLVʗ@cR眖ZRYOXV3cg-׵iv]*߽U=̭ uYTÔȦ~xWQlމp@2Q;d(LVBŠD{I!`[nYJI*fฃYs~:xaI@""]f.'7%R}ejBg@Bq)@c2CI@;.(F9e0,j5aB VaeU v6cy\51G|L=ZC|\爏1>4:ϕp`7%1#ƍcQ)׼}yczxY;g }<=A|爏Jx{`Tt|{2j0=FJY?XG(1=t'LcL|V1F<>9Z}+nn1?,HrؕEt lTZjdGŬ-΅QP%_!o}\?Mx,gs@f7~Y|.^^_c`?q+,U(p|^nJvnR.mȒWm)˦xT%E5F?@ Q2up2NX̀ f5/ـRgz+wyC7!NIx8u"&"W!j%"qFe NKwrSW@Pp/h>2eCݍllZ_fj-Y]W(f[!:U}w9V`kıq (/Q0K(euI,gl%Kuʰ:04y+T8_=dƞSw/ `Dպ{{r=O̩FL$yv9a,`9A$+PDq̀Eee0<ۼUKgTz:;T$B'PH<)%& X["'FYPm ?ΤGߓV;/FegO>e炟}U-u`-r../!DFpx!:;uf'&؞ b(TaaQs^&!׾VοRow}~xLJ 緃c1pr d{]~He`RLsJ)tߕ ,'k~3o/ @R0˶(/!X=|g!׏ #'GnV5Ƌ_9lw{ 6v&8cFat>N ;l烫9iWX\ R l"3<'p5`jtv 6TpqbйJ}kNlA *x п'Z%>˾z{\\#~PO`6Z1S1_d_{ɾ}%: t8Iq8s)}e I"u23(3&0!,R :+Uw̱]%U3!J|Ȳ?1=̣ߐ\S@buOV#b6<.:3@Yp xe$G ؞@[FιRhXIm"cZǮ-Қ,ze:s}g;y OS-6.za| 8'ix+,gHmFo 4; Śh7*gGwfw_i;6<s7 uQUwxsf8CI濙<9Fme/lOV=4f*?fM{t0sp,Z`^Y^s_D.T{]Mqjaз7?~ox19E3==蔜 8j,'cdQ¯+sthcy l^Ʈs~⻀b 4{6kmjFaʰ SdNAsϞ E« wXɤH\frFjaBP0!)ĊqW,9%ZR,-߿r}&v'<PxI؏ kR}xly{skı$ko.~ygE R@!~$8\*(E)3T)YƑiuAvB4 ͘i|ai-w嫇srn|4豼Z|r|b~=FL$yaʬϑ&YN$+P[k.<9WpnoVd4ϨA53*}AE ͎;`LJGk| ך]Qaq*y 駻q}\ :p 26t>M糚9 -5>㖧Jf\b (PxVMZǷoDeI$k8" ix}2wpZ:7f1\|^sZn'1rJOMҬ@SR=yoNE[ 8Xc&2:W6z#1 4xW׎nh*ƣ>wg e͂T<^l&gkkӌxfm=ۚC2J~xuw̡m%Gu <0J HByҖ|`|hvJ~(lvRxt^2bO_:!$n٫l>׺V9UI1Yx~O5.WY!AaHLƄߑLSEL\;hVp߆g<~ g/L>, ckaRy4İM)Ik7 k %v6HHs YH0lTNh%qlA9TRO;#CLP3$ `gBQ30Iq4& lDCUkDa*j :ZF#f "ZwZw}lEH.-֧))AJ컭Y kn:̈́/Fy <yjϏos&)ʿ<*tټ?`ޛT?UI4a:s:jb bU(g5#!"#Sޟl[awAщݭCm0-t5!!"zLI*i}l&71DՖ=p]9+w}ydz[~I>mjlao..r4.C_z.cokgFΉT!2|N~6x96ݯCӦ[I=  30G3K9Q+Qϔ%h3k=f9Ec_.c)Z/G}.~E˱y)7巭T Bj*B]W0a>=Ѻ> P'8-P7s؄5 }[L;b[t4ZXp6eX1|iBW+oxco]ACоNh9=GMH"+u\)qOB[ǰQ ;޵,u_ax_r"_:rvE>%M[Z}NR]DͺŒնl˛HI _:F:?x N{mAJ&:YNrO xS"*AJGY<婍S1fyu__vvu}G59У;kH3Ͽ:x6]]/Fỷ>/_|y?/qs?Ο3wiw=W I,2':_^^ۍynz{ao|XaN#g֡-罪E4^Ւ ?;uף7qb™O ? 
=9勧ۑO+b2S#No[L 4\yY5`6S(6Y}UuH}ԐϟАQsUMVXk̛Ǖ* qedGQu;Xuudɟ>?cǵOxy51Y|'ïql@jbݝCGkW!bs`W' @aZƈ70Lz "N=;$s6%!4V6k%}TkJdX (8 Aj EȚ-GRXg8v3& zëpT`{&vƆRg:U[zz7ĥ1M0eK"%hH)ӄ[D#R{k,J| kbF -Sl1tV9+Lbd]k6*2M[]2gZXU &,LH뤆 sZB\B*-n C9dkՈ:XTp!3lcLNW0szoi# OuS,O`Q%;2v1{k;bG@)Li8{yߺYCD`fu:mbݞ0FںMwtu;B~pǔnq{/;BXy%ox!|5EL5t~aawg_\DNW~u;;dX=ѝ ;_J'>)C=hxn6sh= u9hq0Q.=IJf pb=xS-xf']T9G6ٴ &Nlc:M`'I6UcBdg>6IODY^:,u;;2xZM)FL˭91T[>QB;h{;9щڭi׾)kF[צnjdg|x0MiJoj>tCl%]{'%¥IsӽunMz^,|Gg*\GFr]Q͊jrڛ)S9k# ڱx͉s` 5ǔ"UYk!ɉbfdUÎ].0fb1!]2q6ZtN$z|daw䙡 6nK0nb36V5!l}sj1đv нPed݄/O{<@ӿ`ߋ 3@FSzY΃;7Rޕ'?%g8 K$'?D- ڜr'_X7O nJwHdbyK@--뼥gsךNo |՛Y'3jNs.ʋKheOߍLInZW ~$VNTwm V.37 ӯ7)>6y;m}?>xg{ [o W/.߿ݼ^Lx뛞_xύllƞ]{oM?67CtkRQbRk$dHXQuˎ(*'%Q4c,ݯ0z/?v)f]ZERMa\˘Þ\h-6i%jv'ONH-Vq-;تhQ ]5;ߝps f;z +ةj:7rkuu-Dnc|eΞ}OcPRfYX~rĉ3JKԘ~e$*N`DꃿuЛQ+.)4^j-d=wBO.&7GR_5_w^={GjZ(xB5%p1bmEٚ”j^jf97!V[!U[|JnT=t!vفUK4`4ڕqoLV6TD>uR$7J͕k>'LhK gx1+ 9wȾE A֋+hť<[lM N>AQO1k{ 4]05LH֮X=6Zn )dhgT>QrH ŲY[XV;Z`hza c{+QGÚ({KM7YWwȥ baMNh)ͥ`VK2<Bn&-4W Rԃ{oKsP=c! RGL13X7Y TRUlbV0XHxd ]wvԥ2rD8Eo\@('ߜIt hq lmY/%n%Si`-RB A0O+bᇬQuX}tmpҔ!,W7 LH+*hgbTBQkI ~^yzJ~%Zچw!Yq# Lɦ;ROHQ" 0ڌZ{D0܄1®ĵCwXb.Ҽ,\6;7`fv%n)\m9^Lj=>Ȓ(bR){(z5!u4L+}cx8 Zh_a Z)<έ1ĈщZ\]K(@@a4RBTɦmM0 [:FzB<}ʅ;uP&D0 |Bl%e60vMs_ ,%L/M)eM@yV}fʰ+WGPBم.ىLћ UA+]A&FV" u5 I L.E5paOP`[vl_ʷ#-"ښpm³# ûTgLj,>RJ$QX1-%:RV5c̏R*0yRNĽjeRowmzW^>{u_/^y׮T^B?6kovzSovWޓsy]__i򗱅́gJF_axA}(Vki::gh7:lcy؉D Ȗ!KP/++2|nq>9=9 GSGo&$f2ۛO+o_aSX*ʆuk,Aɂ;bmu!޺_4րφk9=+au6sdLZ8L2Le9|(! +~̗e@fx hXSD`?7$|`rxY07hS>-,i#Ft,֥\Q\7s3eI,NU{k݊6leSo` [u?6 7x3?\]yWl"(o!@fj1 A3 ^~Ĺu4$4ʕ8+(k|KY6QXs?8,B$wn-n" Yc Kr`}eAғYi0* 2TEJl5M*X PTl:4 -@կpݛ 'ae.B=΁#G_Jh㎈ɾTPoݻ48J!M؆ev,MpӣޥZ7 z6v]WC iQҩApmݒ-#"W8Q̤J+CE?Dp3AaDS#cPh7x|&(E3Td1r\s/SЁl3aw8m0piC0ϳ/cWuUI"1ۭv? 
"{Euo6bZXaH[0-1DggD$lVrff]ĴS՛ݱ*˜!!&-1kq"1lc5a upzg< Nd׉};P*NAhVIf( TU^N$Z/v+W$.%yLKIK7ҴVZeiCkբCPi/Skk!'c: "2]v${gF]{6 fm?fi= nTFDã"qo,,lPܢsY ?AzCh9vy'gro&mzk üح$2Ԇ0Ŕf32u%Eˋ/{yoXLq䎳Ҫ|Vǥ7+dQ/9"-3L#ӟ˫W7l]o?/o27p!Hc9:v %O}ëgXkm-p}p*[1(9,dU@"Sex<xVNjzsck$w2x3#M0{ I:dtc$3݇И/,( 426_u>d~~m?˂gd5F<ބ¾G`H p40prI%DH#J7&05uG4t.7Y. $.u5(F\ Aw0%h0g= [QG1ʱap.y5jDJxOc6.nO dDqnʋէn*J7Z?猌0AI@t rqH~K\/c9(^ú9k2rWord:xL`ebe s~{pyW̸)ňƊ`bf#-( ᗱL$*:ZmN?}tZiQN鎿F7VeZF)‹}̩^8 ߥ)|s_%tNgz#\97?wy9lI^} ZJFJlt*5TxKH~ lLP w/h2)>Q{^%\Ԕi6j2PҮ޻P{gCrV,{r9jJWLBSDQ.A^_VuEOg鍍_km}=[t4,^/&y ^\)yxKƾKK7R߱y4>]Y-f, uVʼnL`Iw5,n }^,ڟ,LKP.+e-YnI6%Ump/[N;x.D"}h Xӻa!?&ٔR8g%v[N;x.K'OnmXn6%kڔQA˧obxvm|5S?+D;4#7C Kp_gwZ\M(zt quWܪq802e` _|V'f`no?qn{b EW@],vRK<\O qTCܵasObƤ%Y$d{+8+H_! F+~\v_f("ϗo\ 7ꃃXj䳙L'qVd i͎IFB#c~>?os[i٩{hŸ6ƕ 4hDȼi~E R5,ju(SP !sH6[S !DwI"0&Ur^:|x~2H,0㇡hhe'@hBXY,CƬՙ$LH^㻛w[ r¤Dk8 OI/kfZx1܏8eh[q[Ԉ16@]9cQ&8n8G\/4'>AG0T2p.%S]'ǫ4%>ujN!9cs#ֹY(YD,B+a4|n1h$j$~J:bBjZi1A(ܸ<1Z3s̅D9sEN=|d[5ٽi$NԨRSUXLrl$^p2j1 / `y k!eY UDb@#"H&_eq飣X=X ʹ(K;k@T? %ߌ78i%A#Qw,Q=p4Cadm^T3ll'`zk7D2/p#'b-GXm0^0Ty/6Kla-s` $es#LySh'6qyos F)xQ3 xDtL%*0% UIXrL%$Ԅ"d7!XXV"EǛ%AT>IXM gU$RC8Á`7pu F)S,`lQSX"v"Gq1%ѰP2I)s\!b8(ͨ Op+ r"F̈́*W45#A ,Nz! 
; C`'Ă{ !B&F$,T /O(y@ᴓ)38" :f!h%K։th[kC[!>DNQ ;)x͞+YRFT4{P4R24`T4Ͱ1ENsMsD\΃,U$VwDeaTC'If<īJ'V:I[rA ;=brsA?[1d?KnjŞ6~|O&1 pՓ"=$q~>8\kr 9tC\D ji04 7)-/wbϻmS3=Uq;//a ~fkXVgՆ-?j,c/ ;0 kvjq5€6ng,nz@ hۋb$=bdD$1riC?Iq"; Ҋp$FNK0flʮ?Qc&ų~Zs#&`PNR.?`/6$d-0`C+Vj`/f ǴsXbk PџRnI"C =G gع/C[A"ghcJ uЎa2K˘,3>3(^{!}J}u8gI&%QkZT5?mX OAQv .)Do q?WxhFх>_5҉d?韴ĭW>$LFUw I5e#R$]ņIJYmU\K=aHKzfh#%) 6V':5Z{晬?t 5*Gy<!AA޵q,B%@.MƱH8O J$Q sbRrY.w`K\fK_X%b\*>RV6xuѭ s_Ne̐l`m⠬ȲQ6DDXJjѶ}8 Rvj۩MR"NaxtXD<-Mq||$DhՇRVW3B!3VVmB򐒲;DFs)jwS6'5F{0݊iYK.mA~v|DTv#7/ɘ,u௜|&~\J[z3}mO[yp:vӓ^99; _=峳4m?JrcA纃x ćcPҖ]4/ZMutH'C/3),#nOfgN ա׫kaWi7M kXw7I- tŪ:m_a>?+ݚś]tt4q$\c'طo5}kjv/x*zHb8B58$ 4OۗSOv=| iVu8[q{fQ# i`ggd?NO{NU¸h`=r  ܄eBLiazLwlׄ *oW2'K<fS^-9 CiyV6vWh 9sԨFwχNPtPqlJ=Q9ykRLHX~d  F[\T%%" 9c+Ȝ!U+$kEzְ@jpbԜP'TE2iS<]\)A+jX#Ȫ1K,Kp& E뀰*3'dY']3"SX%(DQ%/ &rzGayc3{[S PX 9Zl% ~RY Cf2'.12D.UTaT՚cJJIci 5x,py7_jRyRX6)`b.1E#(1*n$mOAW2e̅R;g|̃w+,"Aq,&?-Df/G'FH>5 t4[kď|n VE5&pN b+SfRʱ]RVs9 .+I)#(hgcגm`f#>LىmD@jzn]YrgeFdF fJe"q\E8vA|cq*ڄ 7Vz&/?-MY-^)g}o-sYaI"lѾM2MbGt/=KN5~g8_z8y>vC./Gi<*6\+Y^ԫZ*%y@Qd|ʲ:j".l9}Fq#p޼{_Sv|K#"}X@+u ܘfcjX/l~۵ /F Dԛ0"mw r9<\z^wy{8NRRıRJtHL 5;i49`k7"JϷ#0G6}lB{>=\Fk:n}EF@`:KJ^tr4"Ç`ܿ<>f:++!h|y..VZ*gǶdeF7lsqPM9I#O (XÖm&O_|^fg`|x/*@tLs7OV*rpMS\?^_z@xeYo7{V.^8YTBYY8ςbsgd8\Ma8"f|xMU9^1qz2Q KWZxU]T_ľ'pWSRDLS!*-Djhk& =%Z/8_j$q=]~>v&%s16F9zGzCjß, 򐃘q# RE[ՖL+5poW8t&|ɦ?. 
,-m.U>lhܮgDns]n?^7cOa|ؾ=dͣUۑ)o0#cҨnKu1Ż\l`mN|])k>|xDmz;x"Fs{:>CtSSEx6nE }ac J[^/}{#Ozy^/x oИ5qܩ7E IO>^\(Pԯ7Fg8Ju'J D@RK.Bd2 Lpu5TJ2eD}dŵ.!ds?wAq#Q%pzkTkmNK BYZ 5+Z%kXCPGP1[6IG@}B0\-P}2o9L t'|{ RޔgYeyY/,Q^jpFi[rOIzCjP}O}(x(C 53JOhPö(PԨٗ6Jjܪ͎6 ˪& [rW!P`UѨ@6ƪH;:+:EonImGsw||_쮤D)w|R}FI>]XGRP:H kg"JP0QH(%C)_;GR>RQjb=(m :1TOOKzSjyQjfQ`J8Q*n Qi|ZכRst(Έ?[cH{IRgSFT]=P1D|:abז@3 ƩeVS*FL HgʎUV!d)k*'e\EܠU=rΡU V CJؠ\@'vʹ}e]M5f (*M#(keĥml|TٵcdM fgɑU;*^-P%G+S  bAyW׏5<+OZψ_kkmi>0<5>jSûr1}q^^{ oʭDoo?\~!)/ Fh~^ܦO Iߗt7"uoڝ!Iz?,P2)gjհ0OmN>)&7 r] Yuڂ+RgO4%x ^w:}|$ޙ3~G~g(a|dB55@ 13.}šzx䣡2JrUʤ>u暝%kAޮ9ۛo;R23;(Vi18*Rpp+]Z=Qkb=lg>W&kW 9,!",oε(y⪒+RnH&O)yL/\Rfǟ~z1||8ܼk_0n>|'Ճon˥DDd-B-lWuuXZ~5:}$]y9hHnnP,0jRf옸(cۇ3!l1SG3S>Zdϓ} !W[(àgLQ Pw $:bwl={ J9Х{2B9V/ +Qs ~ӵ+#^eîV *ԐprH\]&dȚ ^XRA|ڰoy6REXHNR6Tͪ_LuqŨce~T璋Vh] mWDhitFB_3'yRhyNO]^}MxOs۟? ;?XLT=s+ҊkGyyUrc֮ٻFn$4nHWaȍEdgd{FNw6X?%-"Hl.X,^"悛%SLt0K' \8_smvO)@P7x5ň8bHcyFm`թH$q)JD(mcPcc;3r$–͡3Һ־nԱ]1ȳ~:WGo|f3nkOttRs!+tܳ ۻC]^q00b'鄥)$|;1 VĘW|R:A(]uX İK02jon-8;8 ';VEtlNLXO/o̹w<!wXX!^c%H|* 1i~5 vb4eA}[eaD ,RqdTT)U %1gTHgqH %ҤޭRv]2Bq[+?C0TpϭŨMx s.SJSd9P0:25H2Hج== 5@B s%l< $<'\e \FIJP",Nb@*$߃;ɔ@RjWj^@.FCjVξz%KvA pN3Q|8`!MR&t")sl\ǩ2?,3)c!(ieXD=Uat4GTuX5*8 yFE8bj7@T"QFݚ rJfc"\" >3xD  n嘔ph)`,9g* ˹ʬE%< 1Vv+^ R#$pdƉ958zFQ,GĢcej K.zrݽow(9:K\ C} >, {[B/u,}ݍ!å\\1 )$"}K,;rc>Ȧ՝v?EH/S.fͯ~n\?L̯yi%JeY"2@5fd:8CgY4W32DE 9agy峛luNUI!fYPffD;[K{(3A iOkv RB"I2#̔96.J ɉ;)NI2g>񂀲rkVp4P g takwŻr!"bDNTLd`Ѐ"Ҝ`N3И1Ka1Bzee~~캬u B?uEH2LiJ@DPTf P 1X_.:Tdjc$Wuix.PL8@8i{@5U)6#J)R 8rw.R%{)Ԙ$BR2[,Hz]Ulx/~褘/;Qؠ0Ju;*ƨ7$L8 f6i vP`hIJ1W !ca,zYGvǎ{E]',G{ezEpnnH-cR_t*Y6V Ǿn=rY۹EggFǪ?$β) G2)L%klbAnɕȴDL%1ʐ7#*VؿqΠa:yeG9RΚak o:~:hI%$/|w:]1QuDV=i˯F?nVh5fսy2[ LV"_˹{b5Oqv9*l>C:{́SO}ٲks/]Ru"l*{W6 2{mp{cXYa Ssmѧ@pӿKJXAHJU(EBG@V[޶^11yJVhu0_2Z$(a>1EOUG(S j}9pon)"Y")Y=;hSQSbJFX]8CٶybX;Rw`Q3k`ȐBЌERS \enTm#O8*gyI]pTKsԗC@~??x NOߝ"eQG1Z>Vbj4XevE9V>i'*[r)O|:9ݽ|em{w<[7r<Ƣ\ .3k8pS5>Vsyb5wynRb#w݂=d+O·o(z&ۉ\%EaWBފ qBPLHi;!$8&Ilg)8dGQf˙PC08'Z1a,ED%!K ڐ`ݩإZPJ_Q+lܤ)#6|ڲ>E.h RzRʰ2`T6TϱT'{ c:E̅ub^#326|D*T$q&iDL͌$Qd6|(ST9e 4T# 
' b)WB ؄4=E۷GR 5Z5 !_.EB>5cğZ R-!Py& ' YJm4R<ˀB09(% nsycDǥ~$.):Bh9*%$t_8pnպ=uƬ\smnt\&p~5hd/W\줿 ߫qrmZju !Z,i};h~PAX@?Y)xf"B E9\ <PVT+F货Z s#Mh5)we~"ŀ`p'7$yI`bIߋ)UIٜ&E;NR#"1 .P^Qi\k>U:Yp,!);my7w tBQŻ̀Nq+ʹ[xw!oA:5o0nNn2ZQ)vko:e .ij@h SSJ'vn,8SC|sAȇywXC\)On86Lpݚ:q?p6LCJx<9âv'% WQ'lm*[Wˡ^]tJ9m'm)ZZVU ŢM;x}ڶ^{E&1EE1a!uCM*/w H\aY./_@ؙo0cy0j*ՒMYSy%(ϣa:4 JGj"0k##c!D%|Ȩ$ +u`J*K7~g="B*Svvi'5œg=xG43hҽ_(sbOm s뭫F5aΤ> IGᢦN~ դ+ bG=Ԥ!:s>hM]Zds u{2GU->,FDyLQ0ZY  9bX!&VB#mS 6-8I?@^=,V-! QWR62JS +H 5% F RSSjeɖllH jD>J=٣#fԳ4-BUD߈tQĒ7K{"D܇uUb{j]lub*X}EI]go~q:_W7dFawv[QJJ KC)PK/sي/5J犟QMK/ڳ];quqjV5գPJpi zE.tQz(48j'd9mziɣ4ҮSDYy4L&1DilL1 "P9]7YL>q>\uכ<,3 Tv_5xb>_],=Q`Ws;x\-g?rvs}!onD2 6=d P9je0~4_o0^,|ˇazٷ2WOh'b5]4sՄj 7K:ݨBffsO]d·bW,t0.Uku1Nlĉs^3y{5}W((%7|g'aDN|Ž/Z<z9wȵ?ЦfZooN]F3 :{&7Z~z}LQ~ܚOQ[w%*7"&'ˉ]Ԭ}_Z4 E54Ӵju18 ;A3W:70Qe+{EI_{?TYJQ(Єkj$V)&‰2,B'D1 e@(7G)&vir߫ G%}I^;, N"e]o")YҌD!e".)Ik0I˥)(M$F ~X(eb˷~|oӹ eԝ[_u3K2a9?56V7; |XW"{GDq^.ti/:F%@TzϦӨ} C$iD6K|\0()9"q0O %F)F2"RJ"e AX-$r"DE 1*\Yp$N4PH=eHG!ykkcQ5HƺQ2MX-6J8MX)$2lĂjCD)Rd2RJX갘ffwhiwyB6LXDPKbRjV . KHM],&ND*5fLjR+1OJ%wưg1T2);5b:`rDkmz8u+#v8ǰ~a)s#Y9{A'C0lB гlv % U~ kkvtV'Զa\ >j\\ JS4y/j׈9RX9x?L,d Bĉ*#1xXxdn?Ma en#%{K.SO7L7Z Ub}n*ЋHIzg>LJ.3{V/! 
A3j"*R2c.*^2B L%əRXRa(ZIQ$`:z!+3*.^( Z鍚@sT4,8njPL˧"E #y߿>w%E3q` (9ŕ(4Z%dlUi [Z9cy h$Ѳ, z)zdvkbY >Ȋl™ 4F9"M%:fDWa] t:kKH- Rzv&dz3{"$igUGd.|H.ކ~ "WmT -βi6FO*;iD30R1tl׽MȥGŢb`8< 60CB6jFhQ64Fi H€C`u2nH61XHU5j[yR-|>5h9IH{ⶑLt}/?(sw ߏ.ӓ~>e!W{PK`~[ڃn"jDcF^'"weW"Rj{iڗtFC'm#M̓Qf8rVrkF؅P%3*& ;Bx=R%Z bfI<+ԙçNcw |Wj\Pu5y1(yFNsh5)~Y%YȀ>KlSyJmo_s ]1\i֋c)ZkLRay F+U{pA|lЪi} Y].Q>Us08"GLr.R"XyRmPmPmPmQEyL2O9gZ9L9R<0;Q)Q.=3P\fh1*`j}x}mNot\`=6 b|LLQ]un|FXP!Hs7֘RM,Tl5i|ogCR`t/k'J62ͬcJRQ\(iP'rq!JZfL=y5'ﮯkrq LbW  eO\ZQμ+(ZPT\Fz n yFJsEd)brBƊ1(HՖX421`ųTU5v]CjM6ؒNz@S5%YuA Y.cVx^*_\y¬_ 7Z (xİ( Xx.l Z"X+JN;mEI Ӷ6GV 2p651k,#[ܜqRYcҁ9oͥV`#8P"wL u0\3f%>WY7/@|<V/*`#*0eBUTYtA"h{!%zH .kbY9솹~C |g"s;\ub w|^^7hs-P-uL[lm((!F4I+tXriQ adGyoᆢdK{tQ:;tԪwP EZ5%aIoᾦ" B$-X䘇v V qED2DiR`/bE\|xjj.f+AS)LΘH{pZyQ/v[lhӐbL8'ƫDs2 "\JL_Y)f]нjhsb|6-o]tFHs|~ޜW$Oir2jB`dKZkʓ S2%?LuI3, T̩ zĴMCSA2D'?Kc71 g8bri*g1^[/cpSkEμ$R\˝v ct!KfYS0?K_\32A!Z&͕Ǝ"B fy"g"q V7W/C-7WA,|?.?9kF)tqgw9=n qwA8? PK%pir3S{S|+ѱΘ뽧">ܷaDcBa]4L>5@C<@Oc5|91O] 2D=X/ڲcAv|i<g$ 9#],~-zc{%w1STo~6 o]§OWە:k} ͸i4bǮ:|[|?~UW3['kGc&:lMז8MS ࿍ )_^pڶZKn}skz!}aWocW=\{d8 P-몰pT?ݹ.qHϨ4Q)Mu|Tww<4UB&6Oi͝mе 'DCl!,>ZnJWaw2_ne- s\dB'TǬg];7/E)W7I8YOc^nΜZEGӡVߞ&mKV19Ic1! &}3Hy+)$,eZk[K<X1m9RN9eBaQ 8h_cz6 CIeZhH8d 3(vLfRĝSC1/6vDϰNW<#Q0spZ#H0V`F&FbǭZĠ0oP(:yf3(]0.ZKmjY U?P)ƍ) ?qlSa,Xr%B\pǰu|ԃgz1d'MO:R r4 SEK Lk46y(!B >ctjvXUXO?ٙXbd,m(*m03܅g~flp\ u Ls?ud6y\o\ 8Bj)t^eԒA{QAEMg^$2u^$wS S7Ԃ\kQpI V7mBCewa>IwFK=Nov)=l w2[Pŝ}i؟̳d.5rϺjyN}'x8r] ae=+S  ,/\ݠ6d!oDk۰hD1;vmTU "WG?/OW>obeŸ. %Ȅ7^GYwak\u;1QwvهRDe "-;lüd}3WvVqfEo0+L{v< ?JQF_N⚜jxU/J eoNA&߲,+bR"Iж$H'-!Bц$\潴%*$+RԺHVgn]eI4Mm*\UGͪ/['*F^o0ŭ5JJMZy9 1tdž7(cy1QUqJWf)'*RJw=^ӎbo[e6cH<[:r$(nxYR2UgL"D0ѷv aPδe(܅Dpʥ2 -RI)V:"ŒvZE,ܥIS9^VF;nv̖";`2}9̗b-L8Mz렕(ʐ_E<!NmǬQRsCgKPC{Nʰ$)0*F79FDγ:JhiN{(G02!97\piC䬅ŌfARIggdP= 9Ō kΨ1B)f{{kX^0Ƣ~ Q(R|?*dZ]V t"Hiy H! ]SG M+6B-w#&xxg{FwߘK ]>Vk ]ŖB'z)# !\;f^ٌSp2&^>ccd; Z\rV̡X?^F ^v^:quA -}͎:+. 
nO;{ّD,0%Ҕm",-0VYAb2UX`Eq;0q>xq)H[mo%Up CN/i+x9Y+p%~O|ds$&#nhX،QfynM[e}ҷwZʌNJQ6Za )3Ci֫k}^ E@گrʔ :ό#G$P wڊ y)!4ي2)Rg3yY(h"H+a(<͇nEuί.tw2`TqIǑE6ehDHt!s\AV5.6w@ߡLa3b(ܭ}|WZEȵH_{DRVH^1$u^,q7Ev'a$ELutd)" sbŸҺ7PԇmXIycrA'j-!aUKgzʑ_ "‹CK`}q`Jٙ 濇nIG}yxn ؖ|ԧUX,񴡤wÙnwQ"5% u:z Qq^s\+4B."Ӛ0Q9){aQ!+4QC8'\[9Q,a[fv8h.WpF`wlu&{DI;KҪz5kX# 0 RrV jkJ Z+ Ѡ2!j]Q^VeYi3 ꓨgIRTz#v)d;'UkAH&;dYphM4-O}GLk|)4n^Z 9qM)t-I}Gw14ҷޭ 9qmoS tܭ[ז]nIfvІ&kId D-hǂ.J&܀t QLz:p1p'8DIv-{ O#RΛt|KH*6s4Bàgebi5L 2H/a< =j my(P7ohtӦ.,?Ԧ.bb,5 8QUަ (1׼rPDm^c!vL {Iqb",C'u$S9 p6Ĩ<$k[-{$y@0o [}|ntZ^gEu~Wled7w gNf~ІdثjqUE"V9[q9cI;,5̢fRb>xX5e{~sXQM7y``@ ڛW @m7oHI0Do^6|;5V0Ge"ŐCBb! ["/20Ȫd4!Gu*ksY,jTEY;ʂ*"0ccdN)N R 5ڪ. Y03,Ig*'K)S-`dd'MB\]FέԺr8V$sPnEֵ *eQ VJʠ:T( ݘM?7 ^ꙓ e 1pkւsC3%k)%Q-WZU6r_չ\KH ˪2t*Uژ3JW)HQ^y&v+㉊١)XY+R fۓyv<.,*Cճ X,g7q :uxu&())dVV [ǕrmIŅT>>NRIr$~l7$p5(ʵNW8PK}dF^YP Typ6ɓzlA}Hcc8lA 1 #끲@Sn;h,gs纒%̀jv(j‚0YbRdU]o 홻]QݵHrJ ӗSkmȜbds{lfB䁻nnd)Kz|~l_r߿v.Ъ]~ql9BvXu&v碞oo o~˙kU,[齻`!3yx6pAa=GͫƷ||ҘϤ^}]TwCn5KO7_g_WWdSϞ|P]ۙJVǃ6\8bq{q~.Aυ-/|5o}*>[gϟx?܂|ƹtzߔO3b'dT4Eye1+ t\0$.(/L>UԅAn?B#Q L_['fJV6N -SG?oϫe1[ٟ.'=K,[yəY CkȭhFSfNàp*1z0ªR /9PՎS3I?0mвnxDqOO–5w\^}9q/EiK V ǿio\3NV㋭]c(b!Np6 ºixa?sM`B3 fDб=ǨGtd4tv[` 2"i=L/h~rGԴlp4jrq,y²F({H.tn/EA0yr%uΈcP;#}le>ua!'n)6 ΩޭٛhDA餾#ƻw96+znŁޭ 9qM)&qny7 #zT BL'1m]Ac/ܷwKݺ7 @9Q :fM 5bM )taY\ Z÷Xm[qmŵU5 fj}3q iB96B+(aH,awsX Xvr#pZ<ł7d/Lx:mi3,򼀌<"6BJ(}E=ŋ]뱿vS ۮiI:wDdxVs]FSfݗnG^~+qvz[ջ]͇ŸG+n9>wwgy=+/Y[3Y?؝`Gd6doEcSw#߆M3s7;%G~_ M$6K-&$ɗjBD%WC"Ae=Tcڏ]-]rt8f擄QDyHݐx3ǐ4v¼)+Buz';IBuV'0qoҭRI@yiFs882dBDbWm @:H,R)Њpo(tE4}jد'Aؽt2I*sn\)]rW5Wg6mQ*.Ŭ̅I{xkYͦ9XmFC!8r= KF4#'QOw~h H:ld 7è&!!V#o qQra"T㱬٨7w:/gK^vc__9bjܗvGo]~l\֪撄VeY9w*r  6F*׵)kvÒE9^<K \?.V󋔩/8[7!A-U˺sy{L~u@cĞFH=Vj]Yڼ\bYEnUhMmȜHA]ʺPy\A| կ 1vZ'fpm@] LKYV_%*j7E*YgTc^afY:)D7f܊{&$+,­TDQ z\۪RE&t))wh)Νl:7i*QBNc%2*+ Ϊ10*z,V,39h[hXJUB 9dTaL׶r~XgxA8KK!hh;#鹋]΍(NaH) }Y>' ڠBw8W-2\S8 svvuJ /ڭC0w=վ"f0>cB:6Y%_< -oO9\6nѷS)oYɹg} =bDCR_nIޚ$u+5 ˃)*q|;bXL G"1pXYyRW]î.K oPNZRh'pArCs;}z%NzbM)ӃGNce Om11q /ѓv17b2GT5khbNh#"٫û%CuKϽ+) wod219"%5l: }2OlQL>>Iz Wah5aD$ 
%UZj4}#/Jے!r>O5GYsPNn}Zj%9x)56S N 81DyUSf0! 9qMg]6ynN;hьiıw|/Իu`!'nI660nV)xT BL'1m]ZyNޭ 9qmmSi)ZA`4+&SVt)L|ә<(jUiВ@jIӃ$ܟ8[ / ;L[W1=<\YU&\Qzzp-j͍ci̿ĊWEQ<$xE3mȢ@H;Fg|'(:A!҆Gԕ^* +^A[aDB!MW$AT^7Y RAkHhЪDO*Aj֎|k2\\Bs=KwwԒ6.frSj׈DdSs@4uq' 9qM)0uzۻDb*!6^t`ޭ 9qM)G[M=UN$#g]j-|hqOr&aS6ЦΔ:~~\eW}7ﮫ[Ϸ۳3lyq6[|~*?_g2w ^YI>&|žQ \(c$OȊ/_7_Y ߺѧ~*6=1") -9Aou5EfҝU ԬzGXvR[VaQwe=nI4e;STއ=JcH,ϋ<[Z-Y3H.,AR.ÖbUeEFFƱMǽ(5ocv`k_?fl kRY*ʣ JN\zQ._J݇`͙t|ys=9RKuZdzmRR؏PAIš)yOlDJ)%3 ZJZECVhTPli@@ ? JJ]ZC6@KE@2+ VIZ h ڙoSSK/_w[**Uv~xO\[7\U_igqH>.JEz|_lVbDwbUNd[YDv°'ڂ8aTJ ua}V [%Ҽ볥Ԕtᮺ€D"@rzg| x Mi<18n6HS)A].`n"Ӈ>C'rw(Hц.\b9eFDs/"#3MLuMnZ_H)er@t&څtx" ADwTkBr${i/ ,fi|iwp "@x$93Ji~zu~ǓlIs~o;Wf:۲^wnxs'dB}v\(=Xj9g_}|}7%iY6xd::OLgpWbg;9ɺg_}+  ia}owv:t| ;X;lHoN|R \g/0;3"RS)i`}\%bBATw\Z +ޓ7ڀ&,~PY/钜Tg2ŠMi#Zm=FŖg_7lFݗtҖS惿7g7~^2`y `$\jnCoW>xnPiucIÏ]k; jx&GB"fuvj"9HVGm7Ez6I_mqxtMG:M{[ j4c Ͳd]W!mt&zHjqE^\rxo?@Gkװ qkZG 3֠t r$8(2|lJM1Z?r;e.;EHqK~li|w?q{)=ir~ K$bY\I})k&f1ͱ6ƴ0ZkqFS"sݐX>l[ϧe'ŗ^gqW=a ]I!1˃{~/ ʛ 0Jk3 ޛ1GyD㍚@.%=BgI#C:w_mgILf]ܷj p={'Z<5ˑ }@Z?$J֝%#Ƙ[k(4]~dm{'i %ř WzN7\T͆aڵÆ*14IY+sJ:%l'c (."r"y$^'f`޼z}Y3/oAkɀEJ'O-U3ma}t0c4,׈(RQG3&|-D9`鑤qT%IQHMyz 7~N?)qKkΆ!(9`'~ͷ "RCPUErMEiʘXlyf3*9O|6aIQ$N!>݉Lm]wmf0:#ԆQt ^i>&g?VSvX9_ZX؉-[6VBk$/ 4?vca:Ae6KemZkAp˩)l0rI>)^':5Z'-4a`I3+bTT+/PŖglP]KRg_MѶ'݆ALM[OrL]D`8s%fxdH8MJ(J BԄ "&@>lV2ŕNh=V('0N2i a  flP 河no#IK:L(VQm%(%.%gG\gGj}GH!V2U7Kx9$%:^ZFbXq׮alR!ؑ:A2[16R/PRvٵSs3EX)gDb3uMEK@g=wQNxkK)k]XGhp-̢LEZNJBT-l"(s/(QŤj\KZfG:0Ju4CSB=G>Rc;tb1{,&ݦ/ &6)9Am &Ŋu-X4kxMM 1fr8/),Waj"X}d0qJ&RT!Uf9y6]]p{̱]brp}7Z$;<)R>_B*2zSTV}RԮB+ѺJ:llM.;-&8Dy؟2]U xWgaC̣-K"D˙kH-U? 
9 n Z$Pe$ {psIwZXla2}]}O lY4Uý7jmu Nd(]$FF.ST,0;m0_.}g}UL5lFZaYrLɑ`kJ";tnϥhDYNDz#x7FsC)kQm6,1ŸV0#2)S|3eƻ1C s'չϕ9G@ARBFVPPSs$0uL4ݓjÐ^PJCbn(bJ3"#TcssTk_' JR;,aLc% ${Pʀ,g4Ĩw*6f[3 p"P$=鴄IC,`jŔxBNwH1) N8x5l=Hh{OQ~_Nv} -,dVwȖUKy֭||ݾ&1 } $t BѤGꊢ/22CD]9Z%dc 8E_Ѵ|FzOܹדv<9LkRu۾;W1Gm}1I[e߾ƈq݋g:L$UsyGe;͙_~ +:Ă*1x$A%+[ dx?iٕj&<`v:}rsV':@* !3ƝɔU fz+yĔzĎDY*5O|:n*~s`zmtdJjz=EH$gSn ȎY <g g(TQW1#JO c/ڢe{sSJ)x2LasuAƆ_LxΤԖ[╶р tQ^ˀd!Ox@=O{:/^\}A6X'RundMĻo4>Ňo,>B Rcv~LG_&ބDSI7͉悶Uv[c'/U_V)RRt wNDWnKiX":uCa@mV:!=bY 0]9h29KR?1Ae +iB&)"xxn7fXWބHe}۝xDL,mmfu"<$ha1Yo@,b1ŗY7[֮AV_}M,I^xC"SndoEB{}昷}Wy%X.Ǻo:~=D\,֧V'behFpq茵Yĭ RbV[H%s | ҪTt{8BsW40h3\ux( ;_r?2 hѮ]_zp̐<6p\p53ᬅg!=՘+YPi;ǻ O.=ʴm:л)qDh c4h?w8k'ES+ n> E!Xa{B+n{B+a"pjPH.NMu 54u7S#F P9@*pjrN;@];kQwNj҂Q0*V))Z9ZsBhƉDnh̞dOF`z,PZFgoݏD!g잩Yv_)ǯ1xUڱ^[COi&lL3\CfcEwޞ8%Ǜ/k (Fۙ|^,c(2iy|x8q醹^i0s?gx왜V5}]}OGbe(vgf׌I^*q@|]judނ >37,u.jv,/ґcd<ه+LnzݣoeYPb!b(Lzs Ь]C" a#v$Udd4nlVrD|˘* 2 .y'tļ҆G[aqkđU\W(.5IxE)^CCRG}fSCJcvHTS 3OR{o⹕ht[|E3pqnr+k/.]~92|WRIԋO߶Fk;}v_p#Ιj0#~Li)DzT]SNQ/F}TSA/F}>#U7ОlRVH3bc._4HX Eٮ]Gev]V%/m%bۜA,P$8F\@yTx͍3i:CÁ=4IA1j=#סDHVz(Q/JH(FA+N V)&D=(]KHǨ\Y$g (NZIȒ\rlne6&dXPrֶ&H[j}hcD!u60*RL3+cɴ\X k({羟 Rye+X-"S & #$x0Dx uBޓD&OTCƳ.nӡI1!0(w5ȐC*Hk" }5I)JgDh!K>1r`%$gj2cﺗ25׈T0qF80m\ )8 n9i5X g8a>\PI)(`' E$v"9$bb +NĈqĕXUA,sRN(ZҊri'tޕT0Fnio$HJT8Fuľ 䆖~k]E V"|%cFt 5;u/u'ofhix>\CN/y+ȱuCãFB 5̞qg&-$= g<6t٤wCԑz$҇R qd!N=Us-fkËu88ǗG#_h&Wf+HʽOc)Qv&O$i'H;‡3=vM8d"[hQ>S x&NM)g$GsWU)a܀y 4QKb!$٬ @.쪤ǴmT_UIpS+!P $j@2A 4U݃_$$"F+fl0wwfsegv46tgZwe㮵t(fVv]yb/vMr`0)CҜu/Elji 񕍂zժ])t#}ʻ{b4*mv[;K+is^iLzxSqi6*u&pھȥ.[* ג 4p-!VW=f`"3P3>:\<#ԃWlfRt }u+ KF Œ+Yĭ #ߐ|RlSObLYj| mptze<LOM7j&ۼm._+b_~tStRޑC!kOxy)r^?FR m4sv|f;\?i$|n,e|L؛{_F~R|d',fnw,캑2IBD"5D9hICrmSǼ&p#zk}r V`BY~,˥rAIeRW<Ί Z!ju.xǨ^:%^)c:%m"pˆhHX%nQ sNDs+yrZr ,Q ~(]/N8jtMSJv0}c7[r]E_/fLNcQҮJ^9 [yׄBM(@oBRjp|]G#*lroJjLI-sr+*PRHyHPH:ؐ0۷e(("Z6mT#Ⱦ qzs`X-\+VH%/ލn $%ϳbq 9ywgC1ěz}zxч|4bNDu,b9Q2J:drE-Wq#Jgs3ZE/L>:0|!Cb'堎INJ_/ o٨V 瘑ȂH20b*$@]I&xc iAôlk*!끎& R HYL5Qq%"f1Pf4 hi)f(ʙ0[qn\zmI4Se[ ^c7ǼP<9,S 1R)aUHr"q\I7:rz"(`6xCRFa 
X(d>m60??E7"ϛ-Jd;P73f屄$x R:Z3BE1 lC= q qTYdQxlB4FL)Bx9??lR{+O˯N> gyώo?5gZS3#O<Jp~^纟wvdЪӇs\kGS:>=^] o?Oi5{Fw߹ѥs;EǛ|> L6C+Ԥ<~g n~F`Z֨H T8jx8ň8' ,vUC󌺈<IJo 3B@K}JK{vtH]( |~G(0:$oRuY 8$%E`' 1jl@QIH˂f5tY[l#ZDr|>'ju5D- :c8ɗ+0Ɍ L:уhb{l 5`2OBusd=3]n@nN KՊt~n6Q+.n?1~o .$7 ^`d0؉k+TFQpI} e*j7`5r.C!Nޜݠ͋7}߆>~i&1{3;G6gag`H~3{p3bqA~Z`g凉{6o~˖S5\=aP{|R"yR|E[_Ff6c_b89ݒYK}r<p&%/}rr\QPE,uF7;zw5t羚~qyW»!SA# ,gGn dol#A%O&~./]Z&3X5Lv ְ 8ӜI-SM4S.J1ψVOՒVM-: ޶T &Cd*8DCDSAO+ D`z4Ny㧛o4Q0?7 - IB+ |[Пp91o/|󼈆UrQA_썏U>^=T8WpLNf7/ۏmnj ǩjaqcԞ}Y5%Y2TƊ[i+m"nqa}ueQ@ S~]@ED!M"$ %7k|JjJp#*DBg yABoqܷO㮽g Z>җwaֈ̆(.H/"J$S,IaI)9ɜl ܘ"R/:ge%AI+*xHSVi C:'u}y1"&M.r?vS³"&~hCRe2a/•)(e; Dw>䥨/~JvC =޴䞠vk4l4 Q1UD2Y+ @4yQ qTpkiDN@MF :7 z% y҃Gz{ AF:yT+cS$H~"]HQ(<*F)Ez ~|&IF^ VNfiU.,g0]hג ;J-l_\9p0^Xg0dmVeH+ZKJjRm2⧵|8ig ODz7jG[MlP}slI!{}'ĚQїݱc8bI.<#A8z4# L_c3MP r3qtd ,pFٔ>!I+NT לW&)S!$'9{_}U'<E*&ܧ;J*QΫcIe'i=cB/z4ܴ-"U2w$q sYYVR#DKd?,BmS8v0rkw[@ <3P4`#3@yϣXi9THT$4Jk+ 33RZ/+5&p{*_+W*r8\ ?("jJ6|=ʇC}+b`^a6 3J uߎʏlY@о ǠEg 䧭E^ӋcAM }*)Aq՗ c3 nNO& 7 zmEW|?\4̌j.|49?+Ȃ>>[x՟TS9k|P?V2|P FcXM +m#Ino֭ v67<[( 0}#IJ*R`%CeV1382#"V9A"3Jzr4O ѿ{ˋ$h쟗Yy/O{ 1cgI ޜlgovk;|ʜjj='+.4>ɽ\yD{WIp^%yUQpWeA NP#)Rk%{6??;0TDq&]s3vM7sՋ9u"h &,ѻaү‰bipbU w91?57=fWi,hچy@k#ȼ@!?ꅰCW[Ozxֻ:2ўۏsYر{^%l AaY|ИnNCRQ f-犝TE ,PDlH+}|E@)kNp7_MvPٜY~|~#ӚG!D䇻=~治O-3+<37hݰ7׺]:}:/?1_z/uanf%H;ƞGSm'hQO][_=Uar_虉~_F.|0ksE BDCRWkivx;V#*hy+n'[עcG )fN<n;G`S-s #I_e-,1@1Ufwd} &1NDԀIsr]C?Ychb5O6'4R"cp`)R%K92Գ$%2D٨&V 9A4?Yh:8`@Rr(CybBN*hQK FA*"G|7!Y@PqwٸrcDN}WL1d%BB &ksXhWE%Ţ@UGWcJO9xXꑃcǣ5N%6jtԔ# 1ɣ Gi8t*a>U#@1m8p)kW $RՐd4JDm SET l6h2 g+!5''M@P8*"$ݤ )iAi=PL:'HxX:|nec nトGB5VC,DPӫ%3 c uS{; /{~+g,刺G$J}H&HQ˨wrzXmE "ߧ`?h2LJ¦xGݻ@ ލ~\en!h$ Qޞ@C$4 iw|<0(,P$ٍ>d}%+L8͊Ge NM}ĔRvN8 9ISC2ee&nb. 
Ae&baN8qR~=Q정28*\%ZJTFȤZ#KXm,M_`2` e$xGqR&ŖUB@V Jup!Uxo]% `U"ڨR,[Qa1͕_39v;vkM>^NX_u"~2Psa( aUޤf!e'K=7G?$RnS }SK{::oaD!UȌXVb@GdEuE~OXkn *O0)LͩR.vpx*̕_z)'ny_✼'w`}.q%ȖbWɜCe5ig({0kI dLo |fJ)L)Q]JovKzk'ո# Y>O"$f n\rvQl9Tx c'hoq9sN{aaRJ/fR)rhiNƠw+ S}Di,[c"Tyd/m04IqRxrr5+U@'5T p"/7Y(ύ/QC auڤr~&Œ-ʱrcR_5r]p QRyqkZ.,A;%YFS S 4V >M%)!(M8ƒj|(b}8xYi?ms0je7n#"]NcFƶ)D }4?VjǟUDwEg#j=,FtqR(eדc|Ү%C)0[J{;JEN.sB:񊬻],{dݳ?ӝ/~E[eX Q^ ||W'[5U%1J5hFR5QP- 7tdBl~P&wb9#Ϣ0ҿ rx:/pӋ~ۇQ4lUXlī\-tz&R`@4ՋTdӽ]_ϾȞ>KYg ^X.7jf+J-Okl>fvG}ʕViPt´dR\(-)[.4A#+P"BǬoYd$2Z "QD{\Ⱥ( ·vz (؀ Y(FbD2FY PXD`bGe5A ӎ"A7a*BpP5v*ʣs,F:j1 s\Ja"!` T&|jqIR_6K+1kIG@6=YjAQ Vn 9p!s֜rT``ZfJqu*:#(!k b N0p)y*Qem3*6ә!" 6jRq5.,R{; /+suvϗmo/M\DVDj\_xah3s{'ƒb~]Glʼn׋><ŃOO;7f:q4Ms|<hNōk>}w{=W " ~#[-~F@hZhleaj3c B1Dg`e(‘G(,AպK׻ LwFQJ L@BLAxx/{Hff/&H?sָkig IG"_G޻^'߰LǘvM,:oѴ7 ՕXu֦koMzgSwJn0HaA>?;C͓{ᗬYJ,P7'" NQO5X{)#NUg~+gX0+xBoZ"ǐBmu@Ȓ`]E^s %XeoHs\ RKpla$/|j^h&Լ(C)NvNUJk37hD517G4Ĉ\E{:R]ټݧ]Tmͺ(gw|k ϓ>`Fd?CCFMNR'!%H %ɤ a`sw՗TK6zO{e01D 8b)2i\G Oo3!SdEdonƢ a+QBDKft,yTpQs IZY>8X\ 'b]@^`esZ$XjB+Rཡ!@ 9`^げ 0c{細k C89̜އȌ7,AtC 51h nD\hP+DׇUkԻ;a`FZϓgg8{–:1.ѝoycmQe&ٙ$ILϣ4RPI95x(hcGV9G>,uAQ .Q.CwXBE5F&(x.\Hq+ L"vp$yLGl՗TSY(戝#y,CjF4?_ݰ J'9(ޕȍҘ wD0ƌva`c]x )ۗ[# MVYLŚQ7EF~_fdd"e[\Y:ߖ?xWzՠq{2\$+Oim)pA[zV\u7lBh>[P*"R*'V?[弡l%3rf.,B5`xR< T3s3lqxO&1I2) θED -1ƲItIv,xT:0L*0˃ՊcIEÄQdoAE :}"*6 [# Cڣx}-3k1]C#ݡ[J'sW*Yfp{_$ or j43+%u[Œf+Jcfﱍe_rՐ= _5dz!: p3.(l0=e`+S;2cqذ֕*[\c/ (;\S+g&g}{PΒ%g+󠕯oo>O*IJۡՂ%#8W^ 7/cwǸ#]c2mFG 2%\ϳ1\SN[D+"D#jµ Q:ΉAbРVӘh%eAtm72R(![4nBpAx@Vy$ nKŸa|QDㆺ'\f\R_\v`BI[TH_nC̨eyǓ}9o 1FآKH. KsFlL[q\{T R+tFOg1|u.' 
Rp%+X]$ :_jݜBT<>ֻ!6O-s:zD) ,1LH-UfN<%3ţv TN9<ɒini, +Y2lFH=v*o\D 04{-1"XiQ 4QXDFԦ4Sjz,>@ $u*JNRLʤ.KJ4iUS/C)QX uPTF$EHJ.i$Ft+8x'H-BZ 2SjyM&suͲO?e5^HT˺Y_q&׋|s3'YPgsSs:=?Y'-qN W6ܾZh??ϻdOnjrѷ)l6,2;Jh i(_5r%KW2H8HJFA5#iժ "2 t{_ &NUUbLYkɊ(^##; ?u/PِlzT_\&o.ALGPgQ%[QR'GrrWd3 #+1 ӂKM`ܗm;*}٣{qиeCލ8rw/򜐑lX#^0^X)qrVEWEEAbgb`ApJ4ΠdT']Z6/%NdTY(I+:Ie%Mf!lzS .MFI8 Up2}QJ$uȨh`%M4s6DVtW4'K`IK- 1 #4.-9X62VAD(DNʓE.5x"՚sʤ'3 6 zA3"-A.MI؎Ft!I-6mа3m"1ЭDԄ^44:Y|!v 㮗b[qFczK 6zxH d}EFV߾>,gx1 >5Х@4VU aDe*(DŽ dV|еƁ+\ΛD fW۲8~iсԊhi[q[P0<817MRˋOIϵ= ~iG*wo^&ܪxI TƞZ ΂c.J ^5MWmӆf{tpTTS Ccѱ*ja"nY}cChv3PyX;fKdׄhsr"_dJd" Jx gg4^&Yyae ɑ ϋNK_F5K1-&BĄ(ͲG?o'jYܔ*kDP>3ojcjn>ʆӢynM|>ϧtAWKFPIp%2$M @Pu(Ax-0|jq?΍ݢi "MϏuZ!{^|clUZ "}bZnR5 ҒP9f1AcJ;絰Zj˿(ѐ) 4֕:xOM[_sLɢl#L%mJZ$1 Z)"R&!2z ,g$xZ׹Wr*b P5L{ˍ՘ԟ7k$h4 3΍ c(L۠ʘTlTZo2"8L'NT_1xElkӛ̲qɚ%@%E[Kh }$o-;NAR%Gmt4^Ϩhu/! __A IE ^;1T:n.Lżj3p[- 7ӕBA' k t0 bEasy:-N``mdck]UDLh+ӓkbL(@ ŝzpҳ+$;Mz*g0uێ X%V=/nF:SfA=n^|wpiQwO~ A '. G!SzP?:$t:'@EU蕫K6&s& WήMb OdSv¸I*iGMȬ6EǮZβ+I7vڝ=%9U .@M.gH} MGEbP>Z;bӑ\/;Ӂ18?&=L"ȄMnI3*:DCI$被FA< *#8ɕiQXo|OY2֕5U`k-~JUiwi~]}+wG״o7Uׂ%M͢>nJI bZ"x)E0R)NL +LyBljzĤHC97Kn&0ʍrNQ㤏yomհ].Ŗ:z*a!uCw4֐ x8BӯEK4Qqͱ(*=ܪ6@t*˿ xw*fY~Ff+ZvLRkctj=kh-2O|]qXmB 䣥ncmeG[yyݐև /V aƃW,hw=^hvk훅zkYz5`9|ϿtTA\Rؙ{eWkގGW {yvȰ/[o`^7 5cSCӴL-!LbF %yu$cʣaR_$h9]t]+;UzyJwazp)NZ9ܨ?1kkvO(6LWGLPd' _|`#Шp] D|u{|7"^LQF080mvNG%!Ac@'@F`V'yʡ,+4Xm}`sK!Z$ =h关sJ-PDLLpiau0Ddy;g<'Ff/)&Qp 8 ؄F"*B!c4ZRZ(4?Q в/36#|MŸHy.ZcQzNR5!uft& bXnjfșOs` &-x՚Al 1zX9yksJJLqd9BVeEos><]17Q.$?T>TT?}Da(j:@t/?D fޤ A4r^k_20FT!f, Eغ35)7Ĥ u>?=fyۊĨ"""*EYjJpS|*$<^'7̿gbO!qw]Aψ&OB&1?Sl꟟H kkbseD=I.V.+H !awT}u{Zo_dU6rÍvg簭-#r .Yq9ɽN 脡oYka !'wK(SSw#΁6.҉],f&()iz8(ҖZK0+HD\Q1TvzPPs 9?CUi.gYF@لEBM^U eL3x1|2xf%|ep9eF DHIFS?{OFr_!%;4Ua@XdmOd63/:%)CR>&WMJjRT>H4K$UwTz/ycLIjCrﰎ҂|2sw,W* WKg+_6栃O x}t~SyJ:?&/GpӫqeX3BQEYdBJG .d  !b49 w%_Nu˪+%M'z1Xvh`Oe%eX>O&"SEj*)Gh>#\ioJ\j L !-:OF<dΫFԨ2P0ԆqI04 QК誨_jM'jԚN+n61^lnIPb)جA=B4W-So]^=9rHj ?O y< AV l?(4јkhTMd@M2JI@w.tUf0Sum.48L$> qLc|u$ip+I$L#u$N#՛dC {^9GlT_j')Xl^!n Ya|/"TG(GPK`x bѐTgݞ 
$`j2W'͝ߟwf?6r;P>-S3L6ae1[8h6F)xqrny/X{js0V]P:RXI!NnAޯЧ;}1:IXeRŕ+iRYquU1zt$}y>_lp@r٢5t[<\+\s;i%ôOk{8'.r@W7q.FN>)pNv/d>_ө@L!..E%dK3RMTh޽ŵjmQCaeT`EZsTS`!Fs'Y!ҥM3–WD|FgWH\XE-*(hsJS=|ېWȫa< _.C S[.d_b=o@faIn[x/a4IzFCxHۺԃjfOt49y~V  n2x4HM7T.vDZl(TT>~Gk <˔R6DLbpkʞ _7duK; 22$Gݻ4y"u ֲ1jL$."N Ҷad6%ɩ8Ȯl$\ޜm.T_GԯA]_Iv]Zvvƺ+kfvq=TTXwbjqߩmzjx$TL\o,u^iβ|?盕 y0hD^ƻ#M&Yˤʟkڀ i̿t1!T0A2)c9Y2!\TÝXe/>/޳i4D VwPbfjf vM[pT@Kzf: j=pWp֦]:݂3xk ',+“ ]toԆG{ⱶY(*g xTiRp>V>V{ɟ=ޭ9=lLln8B:/<5[o7c_N9u0իʅf~ ooH BtwJќƫ.ȲS(ژ8D4I Brsl ٳFD4^sm5:MV-U B!JPVz)@AB<:|CU:vn;,{=9t@JϓZowa%:ԔT|pKjWxSe LLNܢWE w<1.pqf^ȃ,`7A:^Ws, OHqow0SW_!;i/J*~̵m8^eN͚d0r В|"H89xXJɟ=j4snT1ݛvk:dj6$;5™2qw6Da%V8EGNCT! cB@!g5 5P@`u,&U3pFdB-DHy=yuS&':DD 5@<☧Nt%@"bFeT&QTbjlի'~ȸp\ Q%NXnʁ"RnFG *I,QAR(at/QoԦwVյX֯ {tLr.u~H_xD~/15^J=>tS@_\mJd a`tE3QNci]7TpY=6ꟃtOnS!q~w"t8\lFćhnGS3/ן.]9V4kXQJ&Gܘk^/}G?urG~P p{HX0SFX8iuŖ92z+YzIzd}Ri/Y^ yd:c16]_M֮w;2w\vE~\qQ`gNV[: >kz{R'lԈTOu__Nbҙ!R#ul$cE4Ây Vs9$ ȣw^Jť\H9a4pcNu -s<:Elt$vP4RCŞe5Bs1IT(k K,2h$="2qf"FC1Ԝjr%ꍥlKA$z핍I&ɐ2*= cb*O𿑰XJ"BgRp!&2&!0 aE,w#li.l4tH9ŕh= *pv*b--#Id(%U>` aҰdO[-akLuacfC$Rg{wJLf[^mDw_XtApL7XmFSvȮ(][FfKI&$gP"QНM$h_ibbeEyݬ]PR1&jQZ5xCZ=fr=5A,5iB Q5 L[OsFroyVx- =DBt)3nsF؂<<\#hBw]WVo.>oՁC<-;ڬ{k$]& =1@:0 ZK$O P(40դwםgc;.ϼp;:Ņ}&D=rX{5ϡԸy0ƚ ,gI}_ '~@7"  R=Unv7Q!Eһ\~Yl!|ie™r/g:X E2oʵvGcLU%[gxƫ܌ݛ{9\t .BU/E(rBV:!`NŪ{F^& d#v+ƋO&+K"q&>QT>x 99t `HPH* [ ,RP;g7S!q-vjČ7^گEKU?΂xW{}4%%+s>MPH+[ "HP-bUeؽW3m/.&K;hҪ1jt57])I=z/Y4@ v703r#>NQm'QRϪ@^RQxo~JXQ0*XarY 1E9w5:SܸB^%7C0ȭ\Yԝ* 1& (Ngp/fb4.ړ,,&}L"g$K~mߖ{YH{r՚)*D*P0PU"]wZ"]wESP-*[cWb:NU;WBTtV1ɒRa2ι(dh.Rs¼t8lp2nhib\LojZZb ƌx&9Hg a[*!r)Sez-ΐ{*1` uj-,Bl$vүk(ZS#6w4*33sATn%yP;sBAcB;Qfw[3t!9ZB])`gUrn7S3XgHk"-t)*TR0d/]=w߬S&$Y}[a Q9E笆f3% )׽>Tn]Jvl_KfCAFӧWOq FJAo*fdvN)cO8w^b<kz̺~ԅQ%MhGf3S35.nǗaNگJ:g^%i-pxnX$L ,sB@$;\,|?I(:*"_ɝz?Ljc}2~Xqe-Uǜe7cPݜ?{ IH*pp;H69 )G! U咺LS;u&PNwLC(N㥰/*CahrYZLwk')af4 YKl_@e7!c˗{ XԠIYsKӒ Ǻй T+P)N罨wa2)L[,"J0l?cɎmē)aE_# Fl_5 atV_p iεY<8HnegJwqAJq˒u#8@ } Ԇ^JlRNV"J?3u$[fp?0p﫤u=&h:rɹIJ R#Ǵa):{hpO0,C^c !9ὙիBu_vYͦ~LR3i (?6}u('ǧf? 
8m`=e(|&ke(׍]nuq \ٔe1:;$ %9PØxdKPwŜ/#۶u,^ߏ_R1,Wr:BVi!OZ浈'Hy~VBb濙!g~zK~Bk ; =63ÇIYQ 4,z>!/p- GE b%aJ5g۱hV$8]ǛJ'pI~\i;wU o] ᓷfjTιGcDR c uybaSg*4\j1g!cXF{Z2-Jס#C`c6>ٔ]Huntʜ2*KYudoaפ_O~{LGfz٧~h?28}FBS E,ep\۔^K! N8EjWw0Hk^5MZ[W]HSW Qu ^uGRjO&`Ae҈"]Ⱦ5W.Trvo˓H^0RNfrw^/ ULyۃǻMbdA%>Lɇey! kY2񻜹\͡(Z$yOf|9gQ8erY}}ĸ1Z8\f9j,wZ,DyfK Cf}i(uӡww6$ oPўmD([z3fwX“2-{M\.>|B(U}GnOBCaӛIk8RI9Ti+[|~'/\S#Rj唶9:,QaeriXĜ8M8n/VrmU@P|qrNV:ZYjMyFdƌf֠ZQ\ʇ(X2/E$y؟9f0s"f)H*Նy¼T xqI1"ZallsRA-J0 48 ÚY0I{kL pD鞞_H*'k, g\OA-30鴓+M|X 3,Omkp7;i9c cU)H6ae1 PĀAHL9?yB%lZROwAn} Cv:6ᦕ7EAY&~i.Nl#?@Ï_f!I?!7oJL!ބ2oc\ҩ];HNt- M4x2]`~6@P݃OqZ tU_L/4`w4@c!h{vr@H]p16FLݻ .p:(#[/ǹ#"f*5Wsf1f1_/qo64L;F's\"Ȗ⏪@1BRM-,((|fgײ3lp/_9.JVWHHp5 j-'J0K'L7^qÁJq#b{0VJpA6h(l&ZhL^1Ф:/6|C1Җ-uJІi4|p*V  F(}4(Q8F''nѶ 8qp Pk4SXNRmEb烰:y  -dP_&s PmUx' ;y{ԒKoi𬱔25c6aqT%+)6S  %PbT㧣M@JIAo$/'& Ppșs^e kem6E rJ[0r2© N!ejVDku v! ;,0$a7Q\ZioB Nc CXD2u+ױoª놝"#l sbVÏ Su `![1䓵6A(HL+P;9&5%`)hbo(h%}E4Dm:40޶3 "<>sCQ鈂fV)5:o+N!/Nke G(+(gAVF&@e zb*案Qq"66haEYC8cU؊ A8ev0?G NiS6?i2e[ jB ޢ:T4"I&2qX5iaX]37=Gu(?b|qlcuc| y]*FON|~1嵫w<.$(ı~xJF_;LAb룢z9-IPT)mQ}htGR:gPNyKbd0>:F= yI݈|=&1B;(Sy_9TOZ+||vu^\uwal;|NުIڏOs> O'88|"McGi_}?N/[xVL/OBag2iA~ɬ?ehr~vupO깔P1}D~uw ʡVv'J5|o8~?3w;::Vvܽx:~:10>;M>{R~)Ѕw;pqp+7yjo]_ՠ;]-oN]yyAuq,^LelTsDr+`V/uw/x_m_b`c*wOy m7o BoVzsOC^/_ozRgO]i|5 K}rOqyﯥ1ߜMPw}8>ϫnOO6{Nw4M7 N7lWGP\+Z7C&/Mq&=뛜hsڄ | zprh]3:xuIޠd:2T^_I[lYX{y᜕k GQٓg% #av_Q8扐=gU}B>s`;C%vC5__^d4UW^xb ?guX=ݟ?_~5kx/跞mr ;'x_ּ好pTmEv)_aL5}M_3jk̯ޔJ֒g;_ǯe2ՅU&X?hd6T;:ԯW~~wz_iw5y._I?9.ݝ;]@ӫ g#s0HHuQap#UJzb&Rν2)ð< J+dII+\4OtCN$Fd85D.]4Q ]KFpdrnDjk+šZ=igDX'DLR8FMۈ7$<PGm= @YVM]GuT~$++T#4{hGs)R,tQF$emZ6f53̷bf{!mo S`$hK(]֨T -s%T,t#q 'SGDw{vwӪ/4{TB{Txsor4弻ˇkĽ|(6D=dc+EI$D3A3bOrbV8mDHTZV%YdK1_UJYbIݻ{-˹FUJ1z8EC)Jb×CĨuJVq+\v"Q1RΑB3( %5ߔդIvoO;7]xRwxnY/x EIF5:1P0XuÌV{&c 62A$HwΑ3X7ʥJu1GOڋʜ?Λzy];U\{Q 4')dRAjA:dJe o{KɁk!nJ&`i㨗ʇaWen>r=4[\c+2yLYJ6ip)!L,dc|*m}gZcW(-i~>ڮ/~u<FڽS`+ڽ;ml^+~;4GչENfmռ.6~nRT/ejdK!M(+kEve]㸚uŶK8O5,vjےvD퓹3-Åhˬ~S_#d'YZd5C[Ҝ= %cY]y 7L_\RUղZ ?faQB+Qi|dYle%;)2θmx d'$'< 
>'qN$FIrUT<$C$h"V1ʙb6gSmoݪR.46ΛRv;NNG93TMAh H2q63b;iD5:1>z@(U8NH6H"$撊ܹ 4~q!p- bkSZ`aiR-'c͕*U`L M\2'8"gӖp 9R` %>DejX <"9}PlOwhZdIn@#Q,[GD[d4M81VRO`A7Z6R䣑8"1:$ZhP#@L P谩|hAZ 9IrysG" sin5A1 Tݒk@ՊFG$ " !tv#xCyOq`$ FyԢ!j!vi;gil鯿]ZԠv6ߌgmwIC,tЪt{p hҭᱟJ Mٕs:VZ Tl[5]j~[u^U;`=vj礥39^`UoR̷lT-̨6G+m$Gb27 BcF<`VU JRSy%4P]p3_A2 Jvu)K+a0Eh1$Q lkjv>b{&KIp̌U:F1RC!-p %.L>ZD /~b8:<=^c9>K';q\>xD@H2 D:i&*;Ƞ57O9~r>[?-0J>y=kx^~|/5cv|Mւ㬼zsN=gB,~ $t? EY;ӁSq(!\Lf;͞V1=/Ys~ҵW|B&s fƀJxZTb#]H2EjH_F'MMvT:&,>L)X,sλ2M\hi=tjP/s$O^ 3H)3mn5XJiW*-i!"4cϰ һX*>Hlť>h{9s)2m=*Y:և tgxfA|-xju m%Ŷ P3x's-'LS3 In}ic.P1H5XSJ(/5y׆=[Rv&_k4ۙGldL֕3PJ#8!0t2H\nr Fȶ``I3^ˋC#Q@FYY (G=dOͷ-d߱W '"T=+³ /7%ݚ,pw-u5uV8gK?on6E`vwpg4WOd+*Ԥw59|XJnonRĿX{˶ੋ٢X HTd]]]/[FJF~ Qy7/<(l NykCZ;QÛ\9SI=@9h>7P!W< (Wc íƇ;xʬcE@t] zXS0b<~pH$4u{FE 3iRJ϶ġrޟ}};}\}6Iowr!@h'K @Z*;g#a3nBCm[oHiS2øJ B룽J=>8ȱ H\\|z];׺vK*PX|8EzoWF2Bl 'EwP2'2FY p0@hfcmz;6lSYՅC]O5RiG,+O%԰GFuKMdf'cIxOH&Czk-̀rkemGh ()Gn[r*@ݻF `맲k@ P畧'ӟm2edcUG񆖑Y$VDPD 2raֶT1&P-bi4ޒ=BXe)9'DFC[AGG\d׊?uv.κpoXl\\zF?˶?pݝ?8п/ajt06 UWxE?/w+;H3Uݿ[y-CZ]93W%DwXtAr7%<<]&V9B5lr@+Mud3-i>mJcH%F oڭ6RK;v\iҷu]CJ29DZi!ORL3lqgC(q\#W!$˜]w{Wmb<4cWuR;I*9/ݝ \Vpfa]?tGPl`r!f&ػ'"Rn"UȌ/M],B`nsTHW,U=B쾣:%|*iČ\{<53d}$ 2=\fRjf4u>P 7( /!އgioԇ^>]=8ad\$lWnN}* -((-(i5Dz,q.,EU^-aYȌB?p;CXL;m Ri+޲V<5i$zcj ӏi<\3G ${\ 5`q|5huu ?3Q隨w8F$ du5'QJ5W9{{UϋK$yqJBV,Q%M`E ͣ$ xϋq}d yE&2ItL5qB@ D&i}iԖ`p5JXlXSr4X*[ndZNVPڠ<)oF?Ͻ@yrQMLȢ֎هϸII zGHK[`1ҕhJx2DA6RwUB{PLTa˶n!L_im\VA`hh+*C~ T?]Z@޺S_-@zUPe0:OH&d/g%. 
Z:Z47M`[,%ĀCЦ{Lfg`j($"!ޗ 8 BK5FKʁK-29qRִkZUxOkA *1 Eq,.^Y H J9*|ΨDY`ppR0%B>YgTs bP< -E%N41|}@fBarYDᔩy.ƻXV_[RCd*V` LU:o4Xjr"9>x0*^yoQZ4ÔO7}o[y嵐lBMYaL[Z'.(N's.LSޝ3E!!}uG\y嵐W$!Fϔɂy7& Q x` v2>"~;Xd=o񽼌x?Km)B^1^?=C5 .YhHSMp,t/!,t˧W`5Ш^79=~n0TRI>)uK7{kBEl5=kFp q:0OwzEDh [& ,915͹=ӊrSi/ihLM6Q'FG!JF Ȯ!"f[E pj(Akpp}ꅺWָn0[g.S9:XLTؿ:p 7˧߆3eT5uRn E #<O[2JFT+[%w:r@Ua];2ThKcQSRRMUg+M1:Z F]Vd t J%Jk85 5Dw DϨBx_^yO{kVN!-iCT+GݠzFu cb-ȸ>ޕwUs8O[5 L߽TN3J35)E]2EU34˥jL2p,ߓBä;?ռX}"ݐg\bA t/`D#gPv:AfPW 5&EFeh/c 5Ά`#S3<*}Z}T͹+Ѻ(6x[R:BH +ui3[/13Ə -e*:8xJD8lD ˒˲,wgMx&i73 2Fڊl MH'A5IY:/U9t Lng * -G9*li;EK!xɕ*KVH vsjڡe}t - L?C"y蠉BJ萷ZԐ"#PzRd91k̉361i Sf9^=0hjt}fDjr91f6GɌ}lĉn;00t$*7+[pB{4X#};:zk]Tjws)Oe}Χȁ`0ՆfXNJ1ҹhDH1u~Z*0eY`dwѴ,,.]"ܢ(v|ȥT_r>R{ҵ#D(S-Eeǧ TQ(s49քՓ&N3!AGWI( d:E4~[+:[:/JJ#f*|q-Ks;>4t}{uqgڿ)>T5$(2>#-u%YGNU3 ;q2wm_e|IdZl藻0IN]$G%xN*Q\/s|Ap&D#s4JOaV{l!M \2c)L420b ie0ͽīAɹ֒ޮʵ#Eq-92uVdN beE"%SDty!ř+!]Jc50)-U_eYtڵkWOn6U'hi2Lv`Y)t1(UcAxA]e\31DM93{(5mͯ@1'dS?qZ돵$ V*o@@ lJPFw`BFa4`Q` ,ҏHO>6M>C+Mًd D`ˎH]%w1\]kK= zmlյ_W^{T$ mdgWøn B7Ь@e\>4#cI,Յ*(Ն|Z Fh.r td GeϹOhiU }lÕxyd~3%q\Q),{1[xfCO!sdt6إIoZ}iQZ4 %i6Í%A)-Wxegq%A<՟dP֧HHH(\x9 2Pfn C6  50XBhAx]j՟|yU>o>yc[I7(lkr.ejϬF3ԦnS+'X&Ni!Qb#BB! &R|;t|Ɇ ۳LCƛqz&xT J=q377q4ƋHPqLSC@R>o~LEѪ_^3`It"vJgQt(($>P< ,6ܱE 0R-z>>gd َTd41,+_}_[֏nԏ5~LBPfL{])%SWn6 ,Tj&LԐ 1t\jsŅ$A^z:JN[P¢C.\CƐn*/nO/*9H YVD۪T9)3m [-m[Rq'n讍[}s5碃H٪?e=zUFC,T*! c"X;g[G[ϳU4o`to E\hBY^` HmkϮdRn~jhɵ|)WJ! 
dhT¾OHn([^t_ع2RAUNL%7Buoivo]d5H q5F²+L4ǝo rt։\JeY gc2*%"9Zq%tWA ?j۶ĭq', )QtiSVhu_M h 7 xZSrm_ߜ29K+h SдX=3Sυ-)Gv-sJ݁< ~M ,PuyIb#SV>08\2gM|X(iOLRw񁓮N9\Ԣ BJ;T/OG!5)`;L?(#Iӈ[Ӟ C#hZOge)fq v&j<%g,ؗ !Nm^Ccqtw ]h<ԗ!Rʃ:/d}M5/gmo{󬜣=O˜::2u]*}~&/D+B KhF.8LK;VgB=j]T"a F #,Xd">O޿{ X4j2>}t#iw'E :(3oϢ2O vu5G#+yodVۓG?y5g4N/(*`=c_d+#"w(O[F$ >HY,I@U+#BŇ19Rku3=t cLv=O<$}6Kڃn-=a0j.?6BJf1F GEԲT Lia.^d7rmhuop-~taQp4g|#0ƍǒAA,/ȑk.ÃJbRa0.$ʂ#98`WQy ]ڇ&h1eW4byx\Tp .M?,-inTFLvƒAe@Tkbi_Ela_Ɠ}X7"}b8]J!D$@W,3?fp< 5AGjϫ fUbp33rCɀ O.=SPL~8x8{u{ GuP9댏EpH XnC 5Fp `jU$[5E90sAf_?.Y~nq$sf֚4'SJc=qQC U/bX,EpE`^ kA 5;1*e(8J=+1 z`!D@V3LX1QU533VΒ#{#W%5@P"AB@ \+vV*f;HfAK'["XTR\T[*Y*9"kL:fO8)tH~ f9sP"f [DJ +l٠~2T`\0V)Ҕ ь'ef kK}Ƌphq|}}Lq|l$iHWOaR^4x_5٥G [1bGUL6Rޢ?`Ί^ɀ.cyzR%;8Cr:;vXwSS{z7Ȃd$fյP=DL%aܬ* _UthYw8\\ J/iECz2Dƽ`h BA{//]s;.r\h۾hZ8^S`_"f }`$]Cpk$55n4/0]|hnx7?͟6#<-uiښ͕=;v>h񈼼N'3ht08V oCxߙGL^2| dR 3r?~.'㹽E-AqvJn~;6/~I뉶!-,ռ:z3@Kq%zg!gn)4{g{KD$"و#)&ecrָQwhI,<#wϜA^޳gB*Z}㚭YR!A#L8"W݆PI*kG0! gI9Z BsB\$Ư6XdTU!QU dN1qpGl4x1F9N#Rq"GNVhp: x`$Q3!L3һYpJz)>-m';P:wGV vEaC Q5xM0CRt_Y-P3EK*1aQj50-8lvšԯNhp|ՎašUǑeR0FDg6XѦ->jM~jE0U Y???YF<β*\,wUEL]^kgs89u\JTw.&+oB9r%(TrR,6ȆHE Ϳ@9+Ž ^, -J3δ$Ņ(֚JL%!̋KtsIG;>Hb7N{S1x xcrב?}£ gSᎺdP1b=B- B=O#,*t3Kӆ q4M+/'-_+ !ijf=BzIt[^ƀ $5 ?XjEZ#'&!üb/^"}o 3Äw\:#Om$}uRk!_Rxvs?!%֗ 8UJѕ\`-e2ߌoI5Pf-0?͑Х]tLPCRTbsv&J7Q9eTs[Z!A^XaL4n}~s@NY6ehs.d ͌yzW]Xw:)v֕:{h}Xh5iƣt/DU!,JLJ_Lq#2%^,&ĘY)1Ɇ6(9 2F`Kp@iɼa ~NzAj0,?d淓}zml o}THɓvnbiW%OUp%G2goԂl?:Q,}'@U:;N_]:cXkUgU ׈ Ocɽ:Puoꁼa-A}1W/٧_פχS2_2Ȅ^@^t Af #_7UWnU3>Uzv7-usmQ $뚣YÃ?Ǽ-2OA 0=9\wvm^|>QT,n >_\zz9$dI8' gӳHB۳)ݰc38n.-q8LbS|vmLM[<+V|lj1dkO) ⿅כ$.^)!?Cݮy5=|:0š@.dNP\Akv Իg<bXa<(~KG~)CBX&[n<[r22D%.(4ňj`*Y\ /6J25,: ZFF)؂׊ '`QGc,-Osbj=gP,I &ix&.KJű:ԁ&/ԈV$74 WDFB!ȷ<,8>a;3a?%k=*A@zTVU!&P)D wJFnE)U+t#hoQBEo g!1b1F QV3'Xzɵ+"U+j>Zx5\t1!1XP "I'QFys2#](2ϣZ4dk2P Or.8u:Kdb4%J8T\jFQ+<(Gwx 2\(xM,{4(y (c Jjzq$o ˇIxu+vϿ;XL7̐aB:9DNYy/u f'yWrj +Xc|t ^ד٥qkKO!, Gn]&SN{A@^&̀V 'M%";,bJ@w険^$V%K_"u^{UPYy{^@UEW?WoqzFRg b5j]ؒ%{%{,Z᮹uVɨ{g1i=і9D`uW9]et ,'* 
8JEKs|;z}xDB;#VkyŮL'mz`an#"^EI3=]з'\ c68B3g`=s+RԽ^gbʬ?):[n~eeX3$|d4!|=P~A3 r3+Jޠ2}b\vTs&1HƅЍrΠ\i L "۷]!MH{x,V1l~4%3^XKu*wіT?>܇]V67~~@ -gKXP9 *cdpabMHebqi7)"?<ulO*=`fӁpGmqoRrd wLe&o o@{NAܹөfDf+z"VuFjdP}tqV{45zi'm7uٻFr$Wz`Rś ;۳Xlc3 (wm}-o0#lJU}ș/ ݇ww>| pכş< q{sp-Ss[}5ZǗyiPy@5(r5_%A֩2+UO]>}۸8_ĄXlM?ZElQ٢.\nIҨ#F-\L9I& nJh6x\Jξ[979)MyXm]!AD/$,c?5*T|gd_{_%mc-V}\,_SD7rGbi C,6깱̃g@kmZ-6{8{ShY.R$NEÔ9Cs ڂ5hV0eu@P᜘ FF`+heûce:m3hq/yKEI$7כV!J!. nCsϟn񶉛ì5Ay|qȷ5o<~3&T:-TRȭ |IgjmB+A1Z#&hcrB4$O*:&,n%|:H`P(=@F a YPLmn3Z1m ^Szuܕ3 Pf$*fF+ :pV5%dC=nz"Rз1]hcc֮w}-gS@jTB:@^17hCs?|fϞ]gpݪW9ڒi>z,RεFW"gpIMȈ#cyo@mP/<0F98wdr]81ȷVY߇QǓÒ(}|k 6 jq#y1C+ wOP~^w1JխW?xlzhRmW_=RC!R :{oW+\~zr-ncv4qS$ͿbRsk/3 -F740pm '< on9I7s6R$jwnk5 ǹ_9m.Qh C -5S' &h1AiRԈ(HΦJ ;SqJ+H| G6BJv$U:q?]ʄۈO~Qˢ~KmF텹K]%)/n?+窾-CXWfWI?F%'šJhfsNRPjzy2 x'jozy"e{%ūI4Qll#bo74,>logjЊQgn>dM9>]6}tvFYT(0B}DL}i&lB*7YCɥ47+\Cpsc>lzcpޒ/P=\Ñ?Ky[*~"ߓFJ4}i†Bʱ9O:Aqgog*ЄTuRf[ GfD3nari6F%vXBM晊F $-"sg Y YY&3/D)IwLڠ ,*TG:*fCkp)*#DRf fJZ%F:][Rj ̤RB]M~P A 3t ?Wч;?ty$+5;2~5$blcg>=e ͖w|klt?'װc0c,NtzrhP>>I C` p߭fpDJJJJ*K@%&-x$Ie1N$uB_nEB'rW) /" }y>оa8oʫ!%s0ae2=0$E#γ|{ypdC6㹽I9_ƱSgqosjvu哏Egu`Y󘧚Fû -xD_zn4^|^-'ڴϸ><}TN57q75 w+]}`m6PTE YhbQ3k/k8^,Q3Pͪ.ʘf}ц>8b8$I?<:(}g PC=@]u)UZh1[NQE;OQǥ9HkL'D+`JOQ9RSמg!%TZǰi! 
:E}0ލ݆$.\9̈d($mÍ_؟l\%F.etSG!xؓ!jйXh2)}!,q%\8S򄍅3\ 6H̶"uyKېg­[|/5PdZ(2alfQӪ14+8oRbr.PJmVXaրP3"p>WƎ7pk p[i jQ"W%ʹ-s~f׈,sQdI9#O_9s Y"in!Γ3֦oM!zRȃ0@n`"ɽ<-z2QX;[-,C!nptT|4d8,`I^H3N눽Wr2 %!+$bD=*nbeכoC+Jľwl+~EX۷D4רɺ fߣ+:?lB+ckNY~zrwYQZ Ҩs6zI@s/<0vxe{&WuU:%fp@q*y¶+F&9KZTe,F:7# FCzE .H\N\^y׺vI~|5h3R $: VHDAuFCTJ+Йe IXf< k ;lBHg9ԝš4"4V`ĊpX2R'c$RՑxFGAk 0R9+ZMFrwb܀bܧn8 &0}fYYl9gp.E^b?[⪲u{2_bkQ"L.凛"~z(j[Ad5E`[ݧwӓf1FQ ثAo'u{3wFuy"HI{N%" \$r$8)h.:=s 'KDRBI>@\T I 5@I}y|FVzG%'7G))Ec2m-QA)0KdRgWq{'<2!FDyl'W&qqEѽJH ]vW#$| }Lc([^Jc(y4^Hې?mm^We<ǿlx|ZQ]d(Ȭ[gs*d.ҎI?/G( iC|#sMZiݯ6ܕDezuFH@ UQzMh`T-BiaO>C'rz&{pu^7\ހ(nu$r盃?_d#}^/{vy _sXJZHƱ?(BvօakJyDont .8Jqv `ʴݧx8pKJp<8/|b}ξƁmMwv1(S{{_d8/{2gq/ū3_J&wtښҁ.Il|(y_W;6 jj[qf*G n"poH2C@ncḊ4$myd 9 a7atߢJAh7xeEKY^] ʏ@~&pYfTQy@` TCmX:ԐqQ8@cQN"DIC>ᚾIXCVFlwx!M=ݨ5tQ3x0o,~CiWG<ӃS 1$RK,JP;'L 6eU^bύ#e'F>(ȽRHhOeĢo&E`1`t %}1(?ʢ ~آ5Zߢ5"98!#*lM\=7j&_KᖫO14p q*,:l@}rz(g="bds4溩jKzkɉ$4ﳃ{ZȗXB[x~]b$:n9r!u#/0+RNN% yzµ& nx_{AMdg3zP#^ O)]#ޣ  'דPwJidsDw'.lW%EЄc'yBr?vuVp} ~O~K/ zxDݯrBٞ׀%o!Ck:8(ƫA;g8nɩ&`ojsT%|fߪS#-U5_-K3w~w)ލFgvnoQ<3uPϿR|ށ:W+ :BБywAeqr'6<-]xZIGhi:18E^wC5X8ܿXЗ^]}-Ą_uY X?`ݥ)o"W7twW|Pp8"ճ]_CDxʛOĆjI3qU C; XWi7Z SL\(i_\E/T{1:Cqu8UewmxdOBC*:D@g/1:Cqu(B:ͺ ?ӺА>ө9뀓zzn oպn7fO8PcUQk^Z>zAIX^]/z+Ův=lCV~g|V|.)E}7x%u^4jcs 1U$8UP"IfQ\&?Cq8މo;v5 ^UA TFۏV12?ZL8гD$55z>*TN_gZd& w0D-E֪XsrzsBy-Nu\ݚ84P$Zmd".C(XxZZ{Pd q@v֬r;&gYY zħ-DSvCq)IQY:̣_vP8Zkc EgNd() J*S,9M1"Hqzv{=~CF־Dd'V|?+-w${ Mr%P*4C]AɚDZZD~P^X-4΍4w&YG`8ۖp[I;ܖC#ZF%SZEɍm<_(g X`K LKxk`C/I8#twqKr煶ZRpշ^"̹h@C}a/j΋R) JtHg?}l1ma[V(*㾹&X@ *hLYVrM1?>i4FJ? Ec 86P͢3 UEӑ @8EдNpT<@>o7S5&Bxkt!Q$+d+g2YR<%N4~];pW\7dv* Z7`i?dkcCaW#!86QDS(()n#h(2D5g'' H2-/cpfee7:Ptrü@Ό(9Y>= {Dtuv鯼u `w#/(F(Z,& %S*Me$?W dhK>6xQg-~DZ/&{F$3I%{.YzUuqDe? UĨ0,EC$B}АX_do,t!)nf# OB   /t *[|t{JvkgF=`)j,Y6QKpBO0ܠ1v8V_βLhKBe 2o7 7ˋgd)Q`vcg 1M`dUbp;{Pk̗%KeuRבj2#Cfbs0ik=XMnja_(ԭ}W}tE>>'Cr6)xf7}r0>_h̹ L8 ^撅4?3Njlr̺,Ō++*(q{E$R̀q 9ʺ @>k',86J X(NgFJ?)jl#:圱SAxFRzt0:+iD0[ꞲʅyF6rD^Vq'gBDlͱ? 
.jA#Y*gLqYC2cjlVJȞ ;W[תۯYkM|o= "HN0 r*H3|^$KTDֳ!Az/1}k][&"Z]%%J^;UmW;i5h3*&HU ixS`D a!;pǵD.k& z°n?|MxeJqN#ړR3 뉊.x  Rk_ F-GZ=Vq)cZo!\na#B'FʲȜI/Uy`(0h(fdK̄ӧz0cIB"g&j"^(G%gNYW( baWH޵5迲k7ER<6NԼLM+ ~$!fm ؋/R) :p;a0@{qR_ުe-% JJIfQ1+kS>mPvM2'\{ijzɅՆ,E~')ƐbJF'y]v$P{FO^z@gqiz`AZ< 5SǔΗQN5gV%gAH$E]\%_ϑ hmw+2.1BE@ѪVWFh'GfMX0׵<ꓧF n"k/~S'mV IfV2CtP$1\<2!Zd gq`h!$5^7ըk-(،-`֥<V[IgL$jf䨍Gx/ҦQ<5iv\8n^Bą~ŷ5%"Wdm'yMu&7BfQ$vX _/uF!VK9;U|!C:Y_<xbb&npus1-ߢȳ=CEK#%AGZ`ܮ$+}Edn:Ya^UCpǴ* [lw7 me(ǡDb X )8? @= ɋSCE,"Jzt͆e{ȒIr[]^5DP~gI2!W`wbBsJRҨL$W <U鍊J ] k; (T%Ra 6A {+w&ؒ^ ˭r&ֶ^~> 1XC <N? Nrk% J:b^1GGޙ#aU2 9(YhD@c w<8}`ty:D:΢ U 1Lߙ!;*o^g9a<-iE:fK%)[T6FS(JEX*%Lx` a؋uM>y:jW7,%2ǂsY<[,TyJL|,榊Z{3VQwFKW57B=' Fჸ.7^tu"T&o)OF`?֩螉}9Wg=Й&lq*'D )X>x3bg^2b3G"=Wd,*T$11mO^_ (9~~)Hf\bŶw?1٨%oM*df}%|oM|Bk3NYUlm_zBWbNbё;rnn{i}a9/?~()xEwӡd]p##O\t ~[E<`wbTFR&#nAL\,닱k.? MngC,t(bb\l9Kl_0k5]z,N:=] nߗLHbӕ&,My/Y_ <ɜE6!TԠ1Q"=֓Rv'?oԽ0gܢ|{9pX)qB5v7d`셭ynblQ,vI*6/wwoɚ|U"!u}u iFQ#N^+Bf^}yQ.4{i'OG a)M/Eȃz3-gsRWÃU*i,ZSS(ug $*4+' , ~ր-|wY{MHiQe.rsx#iEGkȜr(K͹Yk S_%<| r, Ͻi'OGZ~{EZQnmQFY^f))BI#ֿQX c {s{XAJcmKJ{[Mswj:飢~V!u5d#&P{4Jk9<~QFLb|kSaOJ$m (Z%`.Ual~ma #gCza+^_;ŧ*)Ly|_Twv`C{Ehr7R9~axd&< ,8+}.}E<[U*ԿI'}-JI,ݢ|gdHÇMZ'iP{jGE5^553&Q =gOZ7߱I(rpYήm/ˊ!iⱌ]c#!Ơ'<,GBA(>Λ1 IP S~HS8έ~ ը0xӄbF_?cum'}Rk$NY2l s~' %,CK|̓5-۾~,y04/o)jVn>8ڗDBz*:lSou=W(xDt{~0&Ca?;5Fcƨё5")Q?rmCH!5m- +[AE';qWŢ) S^;ml?lE]o!Z\!$0ٮ|ӨO~uX;to1<~.O:x{ԗ^ڹ Ē0INR$CT%:5/E%Ir-:sđ&_FM@ߍ9i. 
<16Pz>{'FVVkdJ, Z̶fqRSO,^k}eiYf+Tb{<_U9FsJn K)~ވHXjVԵ.'Q~r}A‰Pk P` va!~~ xTȶ$}V h %nl%.}GH]F rؾ4 R[F3^кULj=~anv\B -c6y*e!a_ZJ*magȟ0$ /j_Ff{!T Af|>[J9V]vW5~|ua\JX4)FbF/GdGBꝅ,,3r BQ<5 Bq1Hܷ\>QSǿ<׿W\zxURטRynJ2;M??^Q-q/k矉Ӭ?N3fx;68 kס;~_<9^8psǧ 7Q JȗnTyD ZU5#>ymy:~_˘'??_5O 0(ÓQUJ ~~F數x-l3RYg1@L:/X*qCɔ>\`P)l$j O(Ws9y[ jo\?Y 0#L0#5 2i S_ygd6 ]mKbVOyRJ#:[ V^@v Z (ƊM*ŚT5Mdԫ1O2)ڥuAL&: 9v NFgG@;/4J&Vmc !EG +weћWĬ"")aQ{: r_#[xA8'!Gw0 $?ћ7.&܄^``st_v,ydʭ܅ȥ!X,Yq4~]\ "gfm@/ T$T $iA h4 x H;=e}3-l3HK _"Tb,aUy-ИOL$'=4SԉGܷ넒L]7exY:{vzkhW̃ދv G`4]/T6O74 uqi1tE0* lJDuchmvHH[BbW8@1^Z$AА~?Uq։[BKƼz)MH!x f{om.CTTh%a]R|k~絜~(yޓ6q#WX@V=oU*ҦRo_J|Jʅ02C#3-%آhA|fYYh-24ϴPꂇE[qDmIo)!z/d U-+-eg?p#3KCsdP+@EMUj.x u~@ -1kR-}E6EI@dz̻B\E$Y)M `R@Cd+zlD6I DTِ*8o7gs%\-PdZ޷5feX;P@`]tsx/9-/ MYaKi(YNtEiJ_~/D)e_/RzROIR+P-" F.1n/oyWD7!P\gƀ[Fi&MH``K`["1Q_b6.}fB|4r94r_:w[l-~ܘm@AA`@VC4^+wkWolѧHA zUѝՐR$5JX W|tq?®FwtyneS_3Մ1^!P|^n:&^jW屃qL/*\e%h'~4c7%jawB8nB$B+\pBLOIf[RAI*SSblyIyy0cO_q1aK{`> &㰓 Ӏ!3ed\aM@e9|;T~-h Fe*$Zr `fO4)y;q<6y GpW)̶㫒%)NjЈ9FpL)z̏i[qtl}Gw.u“ vz64(V?$$b|ܒ[ ݒi5|?± b4fUN,Wn |rwJ o0zK|W+̈́>b/= G#~k@.oqo0G; 0<&whUʕMK `TG v"&DЕG\G<ȍYub2α7ыe );uHUUk b-?|1[f2V15M&7v/iT }_~ZßdqQ5ϣb,! ~;9] |64}i\O¬lޛ<0Mw>Fm^XyrLm4V3|n<ƈw5@|VΖ׎3Rˌ 3'6(J*f9m%W%)e2c1B|ӽY [2sV2XDIYPI/]\2.bOeshTJx]Q.?^UCpDʲ֨‹!^o}SgԴLTZJZ;|xKx"Tu"zHN=!MXzyP5/\)mB0 + Z#ȥ Bk lg%`hT&YIXIR %~G@l{P.(AL7A9H$thJbv3Ǒ_FY)?b|H?S$h;QF^_?%1gC8B ȼ?agG}FdтDNp(4b.ҹ 9|Piꄤß6}UU=0FU뱌4DR)c ƀv qA5,X;c G >c 3+0R # t5E7ժ1Ar1h4‚ZZF^($oȅɕ0+ $˛-`qҢ>6nzS}/>ot?.S!p~vY^ o/X |uȌЬ7 *l~Ñ7auDW dpigd׋VTR>I};!O׊mZ9ka= ;|sEoɻiQa(DU˟2!rT2jtJ`g<!$T Rj;am'&ҰH%6Ppx@Cɠ=id1~Vij 92c5PED+ Ab^C9%ƠlAzV%f03d m:J<ƒV"A )XN0εy?] 
x%%V!'iDdPf+ E=h~Fmۦ!`xS9vIfs9;ć7boؐT7ZT'OLtH`mHJRg7!,T$RBHE>Pp[?q3k RHzx۬W4bu7ewm<et\ FKQp4goossqw$Oܹ7&Ai}On>eoD`Pj]F: ˆSx2 x+qehʍNƘY RkՋhdygz$ k | ytmפa%9֭Ec7ZVȏ8TL*aٜ H{1nL| #DzX;L!0MĦ$B4DBj'ZVrHpΘR%tAEć(qQ\Uf'q7Nj#WI7)4fǹ?M5-Hs-瑲–yu7Pa򀣚 eNwt8,TF55?z$jjCB6)mz怜k7];vkJi:p'T [5/KnmH3F2vГ1 i4ð;j< y*Ywc67KقGx>+Mez'IjϵtV~NSF'V_XaTew77Ox,^D!ڭ41OG8~xw<~Ô&UWЬTɬtzHr.JHϯO3S79Iw__[I^~f"M`}\d|sQ FKE58㊃VNv}o~h݉*0OPZ)cWqm wBC$G߯4$ٮ%Dk]K6{=5R#@mQ@0vU@-Qm"fFƽFwHY9U?:^d64Aʵqwl8Y<ܙ;ߓІ7T}c<m8֢۝Ck3S|hʋ©޹ nj iPSEh'$6+;3g.v,YqQFhꁧbe\NCGwv.}C^`xV>I ^mr&)PP!V(iYPTd5RAkx\EB^ZF^j P6Ҍ\g& )$} < yC"{K3Z6WŲpZh?H.a T,6J$, q23@*hda #& yn^h!d{bseǣKI7-b]5 0bZfp ift@RF&GRd,~[0E%[oP/"3yYa4 pEĩǶh ͷ(gZ12kEMgE?ބhȳŷT<| ȫSB'ynH5grh쩻ۋrT] E^+7 n]O'gIwmmXzY~1"M44ۗZR۴lƒŹJvIlQ3眏<<$sP7U ( |u~ibR]?.o.̴,&~mˇP\*Ս+6?rۛ8L4zL,UU.nOg8yףg3dI_)TlMy²ٍv }͓= H~Ta֒%0PWty?n(ΓχnockU,?,'G@Vy+9NyK~|p8LDh[/fsUomҥm~34 =xH)?$;A^&VS #z{$١,?C-g%<J1h_ `Gغ9Es9vh`}qR1 tRv^;]ue?YV5Uu~ݚ:^?ETD C \ZvyfI=+ 6&kr-L^b<\{r} m4@W{an58|ۯ e`I &AqlG,fbF28;㑀  oRk8h.NRH X6;GKpyѤAj$L&5.Z 3!b c *IX"))'hDU0DcFK)BҚ q!F'ccܢb5XJ 0ЎY'G yH  85T*HchmcLR4(\d>dxA`( Y ^Bk p%Byƨk\j$(Cp:gX P&𗎦KZlzʢZvmcG*c'~.९}}d_ d`-E@0X }9a^V*Ht%8M(xC]M8 K}hMbcӚ4yUf)ǤPG4cN*AgK*A 伛B t QQ^A |)OBB>Ľ /@ iBK^rR[<6 }Juԣ>$(&cy֊L&;F2цr(,r%fE ;܉xB^/F1 A8L20a4N(äKI FN8/w,j~yn͗Hyѿ}sbB\Uq԰[]G]dfB#^GJ0Q.rBwǿ_K5"oWa;1\e|{tsLq@jvxpQ,EE=Lqux4[߫خE$!!F; !Bd\l|еyw>IN &ƽ` 8C\ T&F04Bh, {1xA`7=%h˃Uv#џ,B/*E];9A'1ѻuC]U#ΪwF[[tݎbn[=E+}*1TP084m/TД'I f(1G*Q~<];q Qc@+Hr3~ݭF*H;;#%Lv1QciuFrZ^Fԏ?x!7T]f>nw[g)_-o6/-Qicycޢb\&4:vl4Sl&R;lMf eaɄ2N h}kae2!)c@~5J%ݗrr#ɶwMEFpiW/6&4S rywTmG߾:Z,no#R=$x'fYJjJ)<6J)KRS^Ue(?E(?$h&aggCdmfTv]ɆgTSzFqoã.8CV::GY5P}՟:TO9g6%_Q}pR0 z.rѠQx ?(& br)=,-BcR[!W:X s$ 1Qsl_ VA5'cjB!2JbhB z$R3%Azh-tKl`|Q*Ew~afH*Yx;)k 0%cV|%'Kna]ګ$vX ְ: )5{* 0j YA!Q JK)myՒa, c3)4A+P+Vഖb Ją'*5M(ݰ)'J頶}&`YyO,1ˁM0|&U) VTDYeB^q5Wgu=:ۋl~ο Em ?ps `B{48Xb` .%tvZC -lKNc2 k]e)Y3ͤ5LR}L3嗷Z-ʙ~m/d-=f)ږ >6N\BY*배ԳȪؼ+%*w 
var/home/core/zuul-output/logs/kubelet.log
Jan 30 05:07:43 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 30 05:07:43 crc restorecon[4751]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 05:07:43 crc restorecon[4751]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 
05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc 
restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc 
restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 
05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc 
restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc 
restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 30 05:07:43 crc 
restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 05:07:43 crc 
restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 
05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:43 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc 
restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc 
restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:44 crc restorecon[4751]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 
crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc 
restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc 
restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc 
restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:44 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:44 crc restorecon[4751]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Jan 30 05:07:45 crc kubenswrapper[4931]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 05:07:45 crc kubenswrapper[4931]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Jan 30 05:07:45 crc kubenswrapper[4931]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 05:07:45 crc kubenswrapper[4931]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 05:07:45 crc kubenswrapper[4931]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 30 05:07:45 crc kubenswrapper[4931]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.143826    4931 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148753    4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148771    4931 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148777    4931 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148782    4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148787    4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148793    4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148799    4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148804    4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148809    4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148814    4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148819    4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148824    4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148828    4931 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148834    4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148840    4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148845    4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148850    4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148855    4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148861    4931 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148866    4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148871    4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148875    4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148880    4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148884    4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148889    4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148893    4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148898    4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148903    4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148914    4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148919    4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148925    4931 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148929    4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148934    4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148939    4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148944    4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148949    4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148954    4931 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148960    4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148967    4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148972    4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148980    4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148986    4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148991    4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.148997    4931 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149002    4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149007    4931 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149013    4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149018    4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149023    4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 05:07:45 crc
kubenswrapper[4931]: W0130 05:07:45.149028 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149033 4931 feature_gate.go:330] unrecognized feature gate: Example Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149038 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149043 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149049 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149055 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149062 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149068 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149073 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149078 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149083 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149088 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149094 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149099 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149104 4931 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149109 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149114 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149119 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149124 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149129 4931 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149133 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.149140 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151265 4931 flags.go:64] FLAG: --address="0.0.0.0" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151287 4931 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151309 4931 flags.go:64] FLAG: --anonymous-auth="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151324 4931 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151332 4931 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151338 4931 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151350 4931 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151358 4931 flags.go:64] FLAG: 
--authorization-webhook-cache-authorized-ttl="5m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151363 4931 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151369 4931 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151377 4931 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151382 4931 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151388 4931 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151393 4931 flags.go:64] FLAG: --cgroup-root="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151399 4931 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151405 4931 flags.go:64] FLAG: --client-ca-file="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151411 4931 flags.go:64] FLAG: --cloud-config="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151432 4931 flags.go:64] FLAG: --cloud-provider="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151438 4931 flags.go:64] FLAG: --cluster-dns="[]" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151446 4931 flags.go:64] FLAG: --cluster-domain="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151452 4931 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151458 4931 flags.go:64] FLAG: --config-dir="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151464 4931 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151470 4931 flags.go:64] FLAG: --container-log-max-files="5" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 
05:07:45.151479 4931 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151485 4931 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151491 4931 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151497 4931 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151503 4931 flags.go:64] FLAG: --contention-profiling="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151508 4931 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151514 4931 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151520 4931 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151526 4931 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151533 4931 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151539 4931 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151545 4931 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151551 4931 flags.go:64] FLAG: --enable-load-reader="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151558 4931 flags.go:64] FLAG: --enable-server="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151563 4931 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151571 4931 flags.go:64] FLAG: --event-burst="100" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151577 4931 flags.go:64] FLAG: --event-qps="50" Jan 30 05:07:45 
crc kubenswrapper[4931]: I0130 05:07:45.151583 4931 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151589 4931 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151594 4931 flags.go:64] FLAG: --eviction-hard="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151601 4931 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151607 4931 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151612 4931 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151618 4931 flags.go:64] FLAG: --eviction-soft="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151624 4931 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151629 4931 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151635 4931 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151640 4931 flags.go:64] FLAG: --experimental-mounter-path="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151647 4931 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151652 4931 flags.go:64] FLAG: --fail-swap-on="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151658 4931 flags.go:64] FLAG: --feature-gates="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151665 4931 flags.go:64] FLAG: --file-check-frequency="20s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151671 4931 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151677 4931 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151682 4931 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151688 4931 flags.go:64] FLAG: --healthz-port="10248" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151693 4931 flags.go:64] FLAG: --help="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151699 4931 flags.go:64] FLAG: --hostname-override="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151704 4931 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151710 4931 flags.go:64] FLAG: --http-check-frequency="20s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151716 4931 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151723 4931 flags.go:64] FLAG: --image-credential-provider-config="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151728 4931 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151736 4931 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151742 4931 flags.go:64] FLAG: --image-service-endpoint="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151747 4931 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151753 4931 flags.go:64] FLAG: --kube-api-burst="100" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151758 4931 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151764 4931 flags.go:64] FLAG: --kube-api-qps="50" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151770 4931 flags.go:64] FLAG: --kube-reserved="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151775 4931 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151781 4931 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151789 4931 flags.go:64] FLAG: --kubelet-cgroups="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151794 4931 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151800 4931 flags.go:64] FLAG: --lock-file="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151806 4931 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151811 4931 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151817 4931 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151826 4931 flags.go:64] FLAG: --log-json-split-stream="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151832 4931 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151837 4931 flags.go:64] FLAG: --log-text-split-stream="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151843 4931 flags.go:64] FLAG: --logging-format="text" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151849 4931 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151855 4931 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151861 4931 flags.go:64] FLAG: --manifest-url="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151866 4931 flags.go:64] FLAG: --manifest-url-header="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151874 4931 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151879 4931 
flags.go:64] FLAG: --max-open-files="1000000" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151886 4931 flags.go:64] FLAG: --max-pods="110" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151893 4931 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151898 4931 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151905 4931 flags.go:64] FLAG: --memory-manager-policy="None" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151911 4931 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151916 4931 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151922 4931 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151928 4931 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151941 4931 flags.go:64] FLAG: --node-status-max-images="50" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151947 4931 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151952 4931 flags.go:64] FLAG: --oom-score-adj="-999" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151958 4931 flags.go:64] FLAG: --pod-cidr="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151964 4931 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151975 4931 flags.go:64] FLAG: --pod-manifest-path="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151980 4931 flags.go:64] FLAG: --pod-max-pids="-1" Jan 30 
05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151986 4931 flags.go:64] FLAG: --pods-per-core="0" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151991 4931 flags.go:64] FLAG: --port="10250" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.151997 4931 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152004 4931 flags.go:64] FLAG: --provider-id="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152010 4931 flags.go:64] FLAG: --qos-reserved="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152018 4931 flags.go:64] FLAG: --read-only-port="10255" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152023 4931 flags.go:64] FLAG: --register-node="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152029 4931 flags.go:64] FLAG: --register-schedulable="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152035 4931 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152045 4931 flags.go:64] FLAG: --registry-burst="10" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152051 4931 flags.go:64] FLAG: --registry-qps="5" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152057 4931 flags.go:64] FLAG: --reserved-cpus="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152064 4931 flags.go:64] FLAG: --reserved-memory="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152071 4931 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152077 4931 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152083 4931 flags.go:64] FLAG: --rotate-certificates="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152089 4931 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152095 
4931 flags.go:64] FLAG: --runonce="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152101 4931 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152107 4931 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152113 4931 flags.go:64] FLAG: --seccomp-default="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152121 4931 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152127 4931 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152134 4931 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152140 4931 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152146 4931 flags.go:64] FLAG: --storage-driver-password="root" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152152 4931 flags.go:64] FLAG: --storage-driver-secure="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152158 4931 flags.go:64] FLAG: --storage-driver-table="stats" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152164 4931 flags.go:64] FLAG: --storage-driver-user="root" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152170 4931 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152176 4931 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152182 4931 flags.go:64] FLAG: --system-cgroups="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152188 4931 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152200 4931 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 30 
05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152206 4931 flags.go:64] FLAG: --tls-cert-file="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152212 4931 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152219 4931 flags.go:64] FLAG: --tls-min-version="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152225 4931 flags.go:64] FLAG: --tls-private-key-file="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152231 4931 flags.go:64] FLAG: --topology-manager-policy="none" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152236 4931 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152242 4931 flags.go:64] FLAG: --topology-manager-scope="container" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152250 4931 flags.go:64] FLAG: --v="2" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152259 4931 flags.go:64] FLAG: --version="false" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152268 4931 flags.go:64] FLAG: --vmodule="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152276 4931 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152283 4931 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152413 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152437 4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152442 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152448 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152453 4931 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152458 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152495 4931 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152501 4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152506 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152511 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152516 4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152520 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152525 4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152529 4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152535 4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152541 4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152546 4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152551 4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152556 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152561 4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152566 4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152570 4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152575 4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152579 4931 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152584 4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152591 4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152596 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152601 4931 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152607 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152613 4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152618 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152624 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152629 4931 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152634 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152639 4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152644 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152648 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152652 4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152657 4931 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152663 4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152668 4931 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152673 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152679 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152684 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152689 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152694 4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152699 4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152704 4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152709 4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152714 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152720 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152727 4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152733 4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152738 4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152744 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152751 4931 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152756 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152762 4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152767 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152774 4931 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152779 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152785 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152791 4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152796 4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152801 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152806 4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152812 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152820 4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152825 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152830 4931 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.152835 4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.152856 4931 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.167410 4931 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.167481 4931 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167637 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167655 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167664 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167674 4931 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167686 4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167698 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167708 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167718 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167727 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167736 4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167745 4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167753 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167761 4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167771 4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167779 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167787 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167795 4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167802 4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167811 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167819 4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167827 4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167835 4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167843 4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167851 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167859 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167867 4931 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167876 4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167884 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167892 4931 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167899 4931 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167908 4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167916 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167926 4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167936 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167946 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167955 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167963 4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167972 4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167980 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.167990 4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168000 4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168009 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168017 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168025 4931 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168033 4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168040 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168049 4931 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168056 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168065 4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168072 4931 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168081 4931 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168089 4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168097 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168105 4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168112 4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168120 4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168128 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168140 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168150 4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168159 4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168168 4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168178 4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168187 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168196 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168204 4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168213 4931 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168222 4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168231 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168239 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168248 4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168257 4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.168270 4931 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168551 4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168566 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168575 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168585 4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168597 4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168606 4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168615 4931 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168623 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168631 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168640 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168649 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168657 4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168665 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168675 4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168684 4931 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168693 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168701 4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168710 4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168719 4931 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168728 4931 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168737 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168745 4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168753 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168761 4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168769 4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168779 4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168788 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168796 4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168803 4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168811 4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168821 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168832 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168840 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168849 4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168858 4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168866 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168877 4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168886 4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168895 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168903 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168912 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168921 4931 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168930 4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168938 4931 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168946 4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168954 4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168961 4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168969 4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168977 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168985 4931 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.168992 4931 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169000 4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169008 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169015 4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169023 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169031 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169038 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169047 4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169055 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169063 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169070 4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169078 4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169086 4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169094 4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169101 4931 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169109 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169116 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169124 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169132 4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169140 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.169148 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.169160 4931 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.170988 4931 server.go:940] "Client rotation is on, will bootstrap in background"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.177579 4931 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.177711 4931 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.179976 4931 server.go:997] "Starting client certificate rotation"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.180030 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.183605 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-11 10:41:40.527569933 +0000 UTC
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.183715 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.207300 4931 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.210792 4931 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.211576 4931 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.231039 4931 log.go:25] "Validated CRI v1 runtime API"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.276835 4931 log.go:25] "Validated CRI v1 image API"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.279362 4931 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.287134 4931 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-30-05-03-06-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.287211 4931 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.318297 4931 manager.go:217] Machine: {Timestamp:2026-01-30 05:07:45.313151778 +0000 UTC m=+0.683062095 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:babf1111-baa6-43bf-8e98-8707b9d18072 BootID:9d83649b-6a34-4b83-bc96-3ff1ac14c758 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ce:65:43 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ce:65:43 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:06:ab:4b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:46:dd:b6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6b:76:50 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:61:3d:06 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:48:9c:e3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:23:04:bc:a6:8c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:f2:cc:3f:32:f5:a0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.318786 4931 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.318971 4931 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.319447 4931 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.319772 4931 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.319826 4931 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.320135 4931 topology_manager.go:138] "Creating topology manager with none policy"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.320179 4931 container_manager_linux.go:303] "Creating device plugin manager"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.320787 4931 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.320832 4931 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.321746 4931 state_mem.go:36] "Initialized new in-memory state store"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.321843 4931 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.327458 4931 kubelet.go:418] "Attempting to sync node with API server"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.327485 4931 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.327506 4931 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.327523 4931 kubelet.go:324] "Adding apiserver pod source"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.327540 4931 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.332384 4931 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.333644 4931 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.334256 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused
Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.334258 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused
Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.334417 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError"
Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.334473 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError"
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130
05:07:45.336670 4931 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339048 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339070 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339079 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339087 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339099 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339108 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339116 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339137 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339146 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339155 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339172 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339180 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.339204 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 30 05:07:45 crc 
kubenswrapper[4931]: I0130 05:07:45.339644 4931 server.go:1280] "Started kubelet" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.344923 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.345261 4931 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.345200 4931 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.346823 4931 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 05:07:45 crc systemd[1]: Started Kubernetes Kubelet. Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.351593 4931 server.go:460] "Adding debug handlers to kubelet server" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.352046 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.352100 4931 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.358472 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 06:29:52.187862943 +0000 UTC Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.358618 4931 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.358971 4931 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.358845 4931 desired_state_of_world_populator.go:146] "Desired state populator starts to run" 
Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.358831 4931 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.357359 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.179:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f69f1afef1f3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:07:45.339612989 +0000 UTC m=+0.709523246,LastTimestamp:2026-01-30 05:07:45.339612989 +0000 UTC m=+0.709523246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.359836 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.360947 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="200ms" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.360995 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.368473 4931 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.368587 4931 factory.go:55] Registering systemd factory Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.368612 4931 factory.go:221] Registration of the systemd container factory successfully Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.369082 4931 factory.go:153] Registering CRI-O factory Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.369127 4931 factory.go:221] Registration of the crio container factory successfully Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.369168 4931 factory.go:103] Registering Raw factory Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.369189 4931 manager.go:1196] Started watching for new ooms in manager Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.369919 4931 manager.go:319] Starting recovery of all containers Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.371788 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.371882 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 30 05:07:45 crc 
kubenswrapper[4931]: I0130 05:07:45.371918 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372008 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372039 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372069 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372099 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372126 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372159 4931 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372192 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372222 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372250 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372279 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372381 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372411 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372477 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372509 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372548 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372576 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372637 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372666 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372693 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372725 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372753 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372783 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372811 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372846 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372876 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372907 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372935 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.372972 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373000 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373025 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373050 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373078 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373105 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373131 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373159 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373187 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 30 
05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373215 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373246 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373278 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373306 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373332 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373362 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373389 4931 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373414 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373475 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373502 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373531 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373559 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373586 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373622 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373653 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373682 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373711 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373738 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373773 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373795 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373816 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373840 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373864 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373890 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373915 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373941 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.373974 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374001 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374028 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374056 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374082 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374110 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374136 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374163 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374194 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374222 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374247 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374274 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374299 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374325 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374350 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374378 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374401 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374457 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374490 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374512 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374537 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374563 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374592 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374621 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374651 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374680 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374705 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374730 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374754 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374779 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374803 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374827 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374856 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374884 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374910 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374934 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374959 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.374984 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375012 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375065 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375100 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375128 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375158 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375186 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375216 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375243 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375270 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375298 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375329 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375355 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375383 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375408 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375471 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375498 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375524 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375551 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375580 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375606 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375634 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375659 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375685 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375712 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375736 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375759 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375793 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375815 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375839 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375861 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375884 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375907 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375934 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375956 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.375982 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376005 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376031 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376055 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376080 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376110 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376131 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376153 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376175 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376199 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376221 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376245 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376270 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376337 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376366 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376393 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376416 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376472 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376496 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376520 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376543 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376566 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376588 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376610 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376634 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376658 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376686 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376712 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376739 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376765 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376792 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376817 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376848 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376877 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376905 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376933 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376960 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.376986 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377015 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377043 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377069 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377099 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377151 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377179 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377206 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377232 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377259 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377285 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377314 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.377341 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Jan 30 05:07:45 crc kubenswrapper[4931]:
I0130 05:07:45.377371 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380620 4931 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380668 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380692 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380714 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380734 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" 
seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380755 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380774 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380794 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380813 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380833 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380863 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: 
I0130 05:07:45.380883 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380903 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380925 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380946 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380965 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.380983 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.381003 4931 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.381022 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.381039 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.381058 4931 reconstruct.go:97] "Volume reconstruction finished" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.381072 4931 reconciler.go:26] "Reconciler: start to sync state" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.405734 4931 manager.go:324] Recovery completed Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.416754 4931 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.420587 4931 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.420655 4931 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.420701 4931 kubelet.go:2335] "Starting kubelet main sync loop" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.420846 4931 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.423384 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.423510 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.424605 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.429177 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.429235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.429255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.431930 4931 
cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.432088 4931 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.432211 4931 state_mem.go:36] "Initialized new in-memory state store" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.453658 4931 policy_none.go:49] "None policy: Start" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.455310 4931 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.455377 4931 state_mem.go:35] "Initializing new in-memory state store" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.459938 4931 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.521483 4931 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.523068 4931 manager.go:334] "Starting Device Plugin manager" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.523167 4931 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.523199 4931 server.go:79] "Starting device plugin registration server" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.524029 4931 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.524053 4931 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.524475 4931 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.524601 4931 
plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.524617 4931 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.533907 4931 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.561910 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="400ms" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.598011 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.179:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f69f1afef1f3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:07:45.339612989 +0000 UTC m=+0.709523246,LastTimestamp:2026-01-30 05:07:45.339612989 +0000 UTC m=+0.709523246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.625126 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.626334 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 
05:07:45.626364 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.626375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.626404 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.626724 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.722416 4931 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.722828 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.727271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.727347 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.727369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.727682 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.728065 4931 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.728151 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729399 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729559 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729752 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729910 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.729976 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.731175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.731389 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.731460 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.731541 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.731578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.731597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.731839 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.732090 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.732150 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.745173 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.745300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.745332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.745834 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.745953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.746013 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.746048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.746708 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.746810 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.747448 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.747496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.747590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.748023 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.748094 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.748661 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.748728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.748751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.749826 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.749892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.749915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: W0130 05:07:45.773834 4931 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/cpuset.cpus.effective": open /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/cpuset.cpus.effective: no such device Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.787696 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.787793 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.787833 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.787870 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.787906 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.787937 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.787998 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.788032 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.788068 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 
05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.788104 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.788138 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.788173 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.788204 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.788297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc 
kubenswrapper[4931]: I0130 05:07:45.788333 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.827368 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.829089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.829142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.829163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.829202 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.830123 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.890938 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891027 4931 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891062 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891095 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891128 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891158 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891189 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891187 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891219 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891256 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891254 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891312 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891335 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891286 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891345 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891380 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891370 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891463 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891403 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891461 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891539 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891653 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891681 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891758 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891785 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891845 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.891935 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: I0130 05:07:45.892047 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4931]: E0130 05:07:45.963037 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="800ms" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.069531 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.092024 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.103403 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.125008 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.130284 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8ed59f42dfaaf99e9b9d7c2f3d6cd22ca5c6357d0efc979f70f29613bee11d7b WatchSource:0}: Error finding container 8ed59f42dfaaf99e9b9d7c2f3d6cd22ca5c6357d0efc979f70f29613bee11d7b: Status 404 returned error can't find the container with id 8ed59f42dfaaf99e9b9d7c2f3d6cd22ca5c6357d0efc979f70f29613bee11d7b Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.131915 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-dd16176df9a25b194f64bbc285ca871fe83a9157995e45ac99634671f9447392 WatchSource:0}: Error finding container dd16176df9a25b194f64bbc285ca871fe83a9157995e45ac99634671f9447392: Status 404 returned error can't find the container with id dd16176df9a25b194f64bbc285ca871fe83a9157995e45ac99634671f9447392 Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.137054 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.141248 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8438d93fed226e78450fc88faaa36b5308080acfa232ce5075cab3515e4413d1 WatchSource:0}: Error finding container 8438d93fed226e78450fc88faaa36b5308080acfa232ce5075cab3515e4413d1: Status 404 returned error can't find the container with id 8438d93fed226e78450fc88faaa36b5308080acfa232ce5075cab3515e4413d1 Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.149230 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b122131d6ceb64d1480bf519eefdb964b30420cf7e966c732893e7759853bffb WatchSource:0}: Error finding container b122131d6ceb64d1480bf519eefdb964b30420cf7e966c732893e7759853bffb: Status 404 returned error can't find the container with id b122131d6ceb64d1480bf519eefdb964b30420cf7e966c732893e7759853bffb Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.156128 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4454be09f6ce2236d995dd359b2f7d824be5104d246fe37e1f4005fe65439ce0 WatchSource:0}: Error finding container 4454be09f6ce2236d995dd359b2f7d824be5104d246fe37e1f4005fe65439ce0: Status 404 returned error can't find the container with id 4454be09f6ce2236d995dd359b2f7d824be5104d246fe37e1f4005fe65439ce0 Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.231003 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.234659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.234732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.234756 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.234805 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:46 crc kubenswrapper[4931]: E0130 05:07:46.235678 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.346001 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:46 crc kubenswrapper[4931]: E0130 05:07:46.346117 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.346685 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.358847 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, 
rotation deadline is 2025-12-28 17:35:35.561640113 +0000 UTC Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.427080 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b122131d6ceb64d1480bf519eefdb964b30420cf7e966c732893e7759853bffb"} Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.428685 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8438d93fed226e78450fc88faaa36b5308080acfa232ce5075cab3515e4413d1"} Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.432143 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dd16176df9a25b194f64bbc285ca871fe83a9157995e45ac99634671f9447392"} Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.433573 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8ed59f42dfaaf99e9b9d7c2f3d6cd22ca5c6357d0efc979f70f29613bee11d7b"} Jan 30 05:07:46 crc kubenswrapper[4931]: I0130 05:07:46.435029 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4454be09f6ce2236d995dd359b2f7d824be5104d246fe37e1f4005fe65439ce0"} Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.666671 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 
05:07:46 crc kubenswrapper[4931]: E0130 05:07:46.666840 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:46 crc kubenswrapper[4931]: E0130 05:07:46.764201 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="1.6s" Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.852735 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:46 crc kubenswrapper[4931]: E0130 05:07:46.852861 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:46 crc kubenswrapper[4931]: W0130 05:07:46.880008 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:46 crc kubenswrapper[4931]: E0130 05:07:46.880882 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.036240 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.037726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.037768 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.037779 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.037843 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:47 crc kubenswrapper[4931]: E0130 05:07:47.038173 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.347268 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.359309 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 01:19:25.102159452 +0000 UTC Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.408871 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates 
Jan 30 05:07:47 crc kubenswrapper[4931]: E0130 05:07:47.410847 4931 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.442706 4931 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.442884 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.442865 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4"} Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.444610 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.444680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.444703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.447694 4931 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8e913d425338029033aa1073be9668185b9f9c4dbb2560466d086b52aa6ce17c" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.447780 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8e913d425338029033aa1073be9668185b9f9c4dbb2560466d086b52aa6ce17c"} Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.447805 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.449048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.449089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.449104 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.451006 4931 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.451296 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.451292 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733"} Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.453415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.453546 4931 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.453568 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.454088 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a"} Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.454119 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07"} Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.456940 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.456972 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0"} Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.457108 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.458263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.458298 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4931]: 
I0130 05:07:47.458389 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.466304 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.467828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.467865 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4931]: I0130 05:07:47.467878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4931]: W0130 05:07:48.311146 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:48 crc kubenswrapper[4931]: E0130 05:07:48.311261 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.346906 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.359589 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 06:06:22.504439466 +0000 UTC Jan 30 05:07:48 crc kubenswrapper[4931]: E0130 05:07:48.365491 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="3.2s" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.468875 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ddb77e9defc8c4121eae34daeca1948ee8aef2d6c884fb05b2a5c53e85cbe9c8"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.469003 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.470076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.470095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.470103 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.473042 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.473069 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.473078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.473139 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.473974 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.473989 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.473997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.477330 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.477400 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.477451 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 
05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.479157 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.479183 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.479195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.481151 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.481214 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.481233 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.483070 4931 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7" exitCode=0 Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.483108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7"} Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.483178 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.483807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.483835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.483846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4931]: W0130 05:07:48.601117 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Jan 30 05:07:48 crc kubenswrapper[4931]: E0130 05:07:48.601231 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.639725 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.643782 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.643818 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.643828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4931]: I0130 05:07:48.643853 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:48 crc kubenswrapper[4931]: E0130 05:07:48.646332 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.360472 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:52:39.314119283 +0000 UTC Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.489664 4931 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c" exitCode=0 Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.489762 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c"} Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.489818 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.491735 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.491776 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:49 crc 
kubenswrapper[4931]: I0130 05:07:49.491788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.494265 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.494734 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64"} Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.494767 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944"} Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.494868 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.494953 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.495019 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.495357 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.494880 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.495843 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:49 crc kubenswrapper[4931]: 
I0130 05:07:49.495869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.495880 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496232 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496384 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496647 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496624 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:49 crc kubenswrapper[4931]: I0130 05:07:49.496726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.361033 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 03:17:33.985053481 +0000 UTC Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.460911 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.471682 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.504769 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.505573 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0"} Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.505636 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57"} Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.505659 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0"} Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.505782 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.506572 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.507247 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.507284 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.507300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.507606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.507682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:50 crc kubenswrapper[4931]: I0130 05:07:50.507703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.361651 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 19:38:44.348541321 +0000 UTC Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.412938 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.513132 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca"} Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.513342 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.513273 4931 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.513355 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e"} Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.513231 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515185 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515295 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515205 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.515850 4931 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.774791 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.846933 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.849037 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.849092 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.849106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:51 crc kubenswrapper[4931]: I0130 05:07:51.849142 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.362551 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 03:56:38.316154103 +0000 UTC Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.393946 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.478289 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.516107 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.516149 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.516239 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.517790 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.517831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.517844 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.518999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.519049 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.519080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.519049 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.519156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:52 crc kubenswrapper[4931]: I0130 05:07:52.519169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.364530 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 11:04:10.730996966 +0000 UTC Jan 30 05:07:53 crc 
kubenswrapper[4931]: I0130 05:07:53.501872 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.518646 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.518679 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.520286 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.520340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.520358 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.520354 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.520578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:53 crc kubenswrapper[4931]: I0130 05:07:53.520613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:54 crc kubenswrapper[4931]: I0130 05:07:54.365380 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 18:28:02.892088524 +0000 UTC Jan 30 05:07:54 crc kubenswrapper[4931]: I0130 05:07:54.809862 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:54 crc 
kubenswrapper[4931]: I0130 05:07:54.810156 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:54 crc kubenswrapper[4931]: I0130 05:07:54.812018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:54 crc kubenswrapper[4931]: I0130 05:07:54.812090 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:54 crc kubenswrapper[4931]: I0130 05:07:54.812114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:55 crc kubenswrapper[4931]: I0130 05:07:55.366069 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 09:26:09.387342625 +0000 UTC Jan 30 05:07:55 crc kubenswrapper[4931]: I0130 05:07:55.394581 4931 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 05:07:55 crc kubenswrapper[4931]: I0130 05:07:55.394704 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 05:07:55 crc kubenswrapper[4931]: E0130 05:07:55.534059 4931 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.055909 4931 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.056230 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.058258 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.058310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.058329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.366528 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 21:09:07.797859589 +0000 UTC Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.567598 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.567910 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.569654 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.569731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:56 crc kubenswrapper[4931]: I0130 05:07:56.569755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:57 crc kubenswrapper[4931]: I0130 05:07:57.367463 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 16:22:21.663547629 +0000 UTC Jan 30 05:07:58 crc kubenswrapper[4931]: I0130 05:07:58.368303 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:14:05.187180104 +0000 UTC Jan 30 05:07:59 crc kubenswrapper[4931]: I0130 05:07:59.347483 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 30 05:07:59 crc kubenswrapper[4931]: I0130 05:07:59.368895 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 04:21:25.527037797 +0000 UTC Jan 30 05:07:59 crc kubenswrapper[4931]: W0130 05:07:59.617444 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 30 05:07:59 crc kubenswrapper[4931]: I0130 05:07:59.617599 4931 trace.go:236] Trace[1759184357]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 05:07:49.615) (total time: 10002ms): Jan 30 05:07:59 crc kubenswrapper[4931]: Trace[1759184357]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:07:59.617) Jan 30 05:07:59 crc kubenswrapper[4931]: Trace[1759184357]: [10.002101739s] [10.002101739s] END Jan 30 05:07:59 crc kubenswrapper[4931]: E0130 05:07:59.617639 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 30 05:07:59 crc kubenswrapper[4931]: W0130 05:07:59.776450 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 30 05:07:59 crc kubenswrapper[4931]: I0130 05:07:59.776578 4931 trace.go:236] Trace[1486905665]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 05:07:49.774) (total time: 10001ms): Jan 30 05:07:59 crc kubenswrapper[4931]: Trace[1486905665]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:07:59.776) Jan 30 05:07:59 crc kubenswrapper[4931]: Trace[1486905665]: [10.001673098s] [10.001673098s] END Jan 30 05:07:59 crc kubenswrapper[4931]: E0130 05:07:59.776609 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 30 05:08:00 crc kubenswrapper[4931]: I0130 05:08:00.233806 4931 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 30 05:08:00 crc kubenswrapper[4931]: I0130 05:08:00.233928 4931 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 30 05:08:00 crc kubenswrapper[4931]: I0130 05:08:00.239221 4931 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 30 05:08:00 crc kubenswrapper[4931]: I0130 05:08:00.239297 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 30 05:08:00 crc kubenswrapper[4931]: I0130 05:08:00.369137 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 11:02:04.927131676 +0000 UTC Jan 30 05:08:01 crc kubenswrapper[4931]: I0130 05:08:01.370290 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 02:38:47.567372124 +0000 UTC Jan 30 05:08:01 crc kubenswrapper[4931]: I0130 05:08:01.420854 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:08:01 crc kubenswrapper[4931]: I0130 05:08:01.421027 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:08:01 crc kubenswrapper[4931]: I0130 05:08:01.422240 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 05:08:01 crc kubenswrapper[4931]: I0130 05:08:01.422276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:01 crc kubenswrapper[4931]: I0130 05:08:01.422286 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.370705 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 20:00:18.305538746 +0000 UTC Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.487406 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.487691 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.489677 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.489752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.489773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.495270 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.550224 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.552273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.552341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.552360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:02 crc kubenswrapper[4931]: I0130 05:08:02.983762 4931 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 30 05:08:03 crc kubenswrapper[4931]: I0130 05:08:03.371229 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 05:01:31.453542205 +0000 UTC Jan 30 05:08:04 crc kubenswrapper[4931]: I0130 05:08:04.372180 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:30:05.610266032 +0000 UTC Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.229442 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.229867 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.230325 4931 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.230643 4931 trace.go:236] Trace[806998970]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 05:07:54.235) (total time: 10995ms): Jan 30 05:08:05 crc kubenswrapper[4931]: 
Trace[806998970]: ---"Objects listed" error: 10995ms (05:08:05.230) Jan 30 05:08:05 crc kubenswrapper[4931]: Trace[806998970]: [10.995515157s] [10.995515157s] END Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.230665 4931 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.231120 4931 trace.go:236] Trace[653608471]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 05:07:52.528) (total time: 12702ms): Jan 30 05:08:05 crc kubenswrapper[4931]: Trace[653608471]: ---"Objects listed" error: 12702ms (05:08:05.230) Jan 30 05:08:05 crc kubenswrapper[4931]: Trace[653608471]: [12.702132969s] [12.702132969s] END Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.231143 4931 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.236609 4931 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.268996 4931 csr.go:261] certificate signing request csr-p6576 is approved, waiting to be issued Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.293626 4931 csr.go:257] certificate signing request csr-p6576 is issued Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.304400 4931 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50822->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.304459 4931 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe 
status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50834->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.304532 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50834->192.168.126.11:17697: read: connection reset by peer" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.304531 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50822->192.168.126.11:17697: read: connection reset by peer" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.304937 4931 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.305023 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.340659 4931 apiserver.go:52] "Watching apiserver" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.346492 4931 reflector.go:368] Caches populated for *v1.Pod from 
pkg/kubelet/config/apiserver.go:66 Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.346852 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.347454 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.347505 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.347575 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.347850 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.347879 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.347923 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.347947 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.347991 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.348298 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.350916 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.352511 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.352552 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.352939 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.353029 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.353115 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.353032 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.353040 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.353268 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.363711 4931 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.372492 4931 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 00:43:00.723969905 +0000 UTC Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.391193 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.395655 4931 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.395730 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.404155 4931 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.414256 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.425506 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431669 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431703 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431733 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431750 4931 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431770 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431786 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431896 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431919 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431936 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431952 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431969 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.431985 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432004 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432022 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432043 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432058 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432078 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432099 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432114 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432130 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432145 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432161 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432177 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432192 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 
05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432227 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432245 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432263 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432281 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432307 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432323 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432339 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432356 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432390 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432409 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432448 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432465 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432481 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432496 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432511 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432526 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432541 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432559 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432572 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432589 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432617 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432634 4931 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432650 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432668 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432686 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432703 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432720 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432737 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432753 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432771 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432788 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432805 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432821 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432839 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432856 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432874 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432893 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432912 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" 
(UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432928 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432944 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432978 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.432995 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433016 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433032 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433048 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433063 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433078 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433102 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433119 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 05:08:05 crc 
kubenswrapper[4931]: I0130 05:08:05.433138 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433153 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433171 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433187 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433204 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433220 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433237 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433251 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433266 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433282 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433298 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 
30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433313 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433328 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433345 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433362 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433378 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433396 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433413 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433442 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433458 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433473 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433488 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433503 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433520 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433536 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433551 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433568 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433585 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433600 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433616 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433633 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433648 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433666 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433681 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433698 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433714 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433749 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433767 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433783 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433801 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433817 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433834 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433852 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433867 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433883 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433900 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433917 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433932 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433950 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433970 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.433986 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434006 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434023 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434039 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434055 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434073 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434091 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434109 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434127 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434144 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434161 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434181 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434197 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434212 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434229 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434246 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 
30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434261 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434277 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434293 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434310 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434327 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434342 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434357 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434375 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434392 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434410 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434443 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 
05:08:05.434460 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434476 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434493 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434509 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434526 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434543 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434561 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434580 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434597 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434614 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434652 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434669 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434699 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434716 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434732 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434748 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434766 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434782 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434801 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434818 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434836 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434853 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434869 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434887 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434903 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434920 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434936 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434952 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434969 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.434986 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435002 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435020 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435035 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435052 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435071 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435089 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435106 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435124 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435141 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435160 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435176 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435192 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435236 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435260 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435282 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435338 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435358 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435380 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.435408 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.436162 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.436245 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.436284 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.436312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.436339 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.436370 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.436396 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.437076 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.437467 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.437507 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.438015 4931 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.438346 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.438491 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.438776 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.438821 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.439070 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.439069 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.439233 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.439353 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.439606 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.439616 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:05.939581335 +0000 UTC m=+21.309491592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.439760 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.440003 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.440098 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.440440 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.440462 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.440563 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.440453 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.440941 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441127 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441086 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441311 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441675 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441704 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441869 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441875 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.441984 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.442140 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.442301 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.442334 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.443480 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.443610 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.443746 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.443765 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.443777 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.444159 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.444254 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.444774 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.444809 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.444882 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.444973 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.445296 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.446405 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.446463 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.446950 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.447156 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.447727 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.447389 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.448038 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.448243 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.448457 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.448647 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.448897 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.449025 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.449322 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.449458 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.449586 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.450265 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.450466 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.450499 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.450594 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.450658 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.450651 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.450917 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.451260 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.451334 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.451277 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.451370 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.451727 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.451779 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.451989 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.452164 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.452648 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.452938 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.453034 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.453070 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.453456 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.453491 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.453517 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.453773 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454005 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.453909 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454116 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454370 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454367 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454383 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454631 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454640 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.454764 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.455139 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.455336 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.455595 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.455893 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.455959 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.456109 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.456436 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.456587 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.456643 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.456678 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.456987 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.457377 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.457877 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.457927 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.457981 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.458513 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.458598 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.458607 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.459093 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.460128 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.461123 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.461601 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.461774 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.462031 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.462047 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.462652 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.462973 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.462995 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.463367 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.463606 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.463756 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.463805 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.463837 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.463991 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.464152 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.464919 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.465019 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.464678 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.465192 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.465413 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.465650 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.466026 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.466122 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.466324 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.466908 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.465156 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.466910 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.467117 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.467132 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.467276 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468019 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468160 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468249 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468264 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.469070 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.469146 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:05.969125776 +0000 UTC m=+21.339036033 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468441 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468783 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.468739 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.469629 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.469279 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.470364 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.470121 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.470697 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.470936 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.471098 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.471703 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.471761 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.471809 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.471856 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:05.971844177 +0000 UTC m=+21.341754434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.472455 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.472527 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.472553 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.472834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.473486 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.473992 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.472751 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.472926 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.473196 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.473581 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.473603 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.473375 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.474270 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.474562 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.474600 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.474607 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.477745 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.484156 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.485075 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.485338 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.485380 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.485433 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.486027 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:05.985990931 +0000 UTC m=+21.355901188 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.486271 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.487747 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.487946 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.488053 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.488094 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.488998 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.489057 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.489273 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.490786 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.490989 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.492534 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.492678 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.496127 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.496184 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.496942 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.498278 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.498475 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.498531 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.498542 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.498803 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.498837 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.498877 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.498941 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:05.998918643 +0000 UTC m=+21.368828900 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.503124 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.503371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.506037 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.506894 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.507665 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.507740 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.508043 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.519703 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.532705 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.533535 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537381 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537459 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537524 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537541 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537558 4931 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537572 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 
05:08:05.537585 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537599 4931 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537613 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537626 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537640 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537654 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537667 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537681 4931 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537696 4931 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537709 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537723 4931 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537736 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537749 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537762 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537775 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537819 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537834 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537848 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537862 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537876 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537891 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537907 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537919 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537932 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537946 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537959 4931 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537971 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537985 4931 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.537999 4931 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 
05:08:05.538011 4931 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538024 4931 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538036 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538049 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538062 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538076 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538088 4931 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538101 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538114 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538128 4931 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538142 4931 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538154 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538170 4931 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538182 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538195 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 
05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538207 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538220 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538233 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538246 4931 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538259 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538272 4931 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538285 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538297 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538311 4931 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538324 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538338 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538350 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538362 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538375 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538389 4931 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538401 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538437 4931 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538450 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538464 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538476 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538490 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538503 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538516 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538529 4931 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538541 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538553 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538566 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538580 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538596 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538610 4931 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538623 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538636 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538648 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538662 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538675 4931 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538688 4931 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538702 4931 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538715 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538728 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538742 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538755 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538768 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538783 4931 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538797 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538811 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538824 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538836 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538851 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538868 4931 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538882 4931 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538894 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538907 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538920 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538934 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538948 4931 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538963 4931 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538976 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.538990 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539002 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539015 4931 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539028 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539040 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539052 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539065 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539079 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539092 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539105 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539119 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539134 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539147 4931 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539159 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539172 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539185 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539197 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539210 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539222 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539236 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539306 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539319 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539334 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539353 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539366 4931 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539378 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539391 4931 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539403 4931 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539432 4931 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539446 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539459 4931 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539472 4931 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539486 4931 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539498 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539510 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539524 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539537 4931 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539550 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539564 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539578 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539591 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539603 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539617 4931 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539630 4931 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539645 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539659 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539672 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539685 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539698 4931 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539710 4931 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539722 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539736 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539748 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539760 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539774 4931 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539791 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539804 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539819 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539832 4931 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539845 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539857 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539872 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539884 4931 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539896 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539909 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539922 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539934 4931 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539948 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539961 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539974 4931 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.539988 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540001 4931 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540015 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540030 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540044 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540058 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540072 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540086 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540099 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540111 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540124 4931 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540137 4931 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540149 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540162 4931 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540175 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540190 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540202 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.540282 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.541130 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.541760 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.550062 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.553632 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.559450 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.560344 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.561515 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64" exitCode=255 Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.561565 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64"} Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.573928 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.591026 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.602705 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.614007 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.629792 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.640829 4931 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.640873 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.641817 4931 scope.go:117] "RemoveContainer" containerID="13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.641859 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.672040 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.674372 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.677729 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:05 crc kubenswrapper[4931]: W0130 05:08:05.682832 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6b834b3df303d8dc92f569b168d2312b27cd03de53fee3233ef600d7c8f06593 WatchSource:0}: Error finding container 6b834b3df303d8dc92f569b168d2312b27cd03de53fee3233ef600d7c8f06593: Status 404 returned error can't find the container with id 6b834b3df303d8dc92f569b168d2312b27cd03de53fee3233ef600d7c8f06593 Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.687818 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.690195 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.702538 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: W0130 05:08:05.709232 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ab6cac5ae8910baa78fd8ebe52c8ca97d361d058afd87642d36c42f0c7e2f80e WatchSource:0}: Error finding container ab6cac5ae8910baa78fd8ebe52c8ca97d361d058afd87642d36c42f0c7e2f80e: Status 404 returned error can't find the container with id ab6cac5ae8910baa78fd8ebe52c8ca97d361d058afd87642d36c42f0c7e2f80e Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.717558 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.735095 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.746799 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 30 05:08:05 crc kubenswrapper[4931]: I0130 05:08:05.944620 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:05 crc kubenswrapper[4931]: E0130 05:08:05.944817 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:06.944777329 +0000 UTC m=+22.314687576 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.045143 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.045188 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" 
(UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.045209 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.045230 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045345 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045402 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:07.045387788 +0000 UTC m=+22.415298045 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045466 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045486 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:07.045480641 +0000 UTC m=+22.415390898 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045547 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045557 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045569 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045589 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:07.045582633 +0000 UTC m=+22.415492890 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045631 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045640 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045646 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.045664 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:07.045658265 +0000 UTC m=+22.415568522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.105155 4931 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.295072 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-30 05:03:05 +0000 UTC, rotation deadline is 2026-10-23 05:51:26.216198548 +0000 UTC Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.295162 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6384h43m19.921038667s for next certificate rotation Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.373553 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 10:10:03.857174746 +0000 UTC Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.567836 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.570709 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7"} Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.571077 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.572517 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"100862e7f6ab10748cb8df309c37999e9df3c5b541b5fa2ae9eca60d280d80fe"} Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.576322 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91"} Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.576398 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257"} Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.576415 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ab6cac5ae8910baa78fd8ebe52c8ca97d361d058afd87642d36c42f0c7e2f80e"} Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.588235 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880"} Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.588318 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6b834b3df303d8dc92f569b168d2312b27cd03de53fee3233ef600d7c8f06593"} Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.592144 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.609013 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.630279 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.646555 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.662101 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.672394 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.711511 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.747593 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.762360 4931 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.777583 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.796636 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.821138 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.836901 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.867015 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.885601 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.898814 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.917497 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.932516 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:06Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:06 crc kubenswrapper[4931]: I0130 05:08:06.953060 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:06 crc kubenswrapper[4931]: E0130 05:08:06.953313 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:08.953260887 +0000 UTC m=+24.323171144 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.054232 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.054727 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.054929 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.055114 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.054501 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.054880 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055474 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055490 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055554 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.05553653 +0000 UTC m=+24.425446777 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.054997 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055575 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055582 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055603 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.055597352 +0000 UTC m=+24.425507609 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055203 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.055824 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.055775927 +0000 UTC m=+24.425686214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.056117 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.056031793 +0000 UTC m=+24.425942330 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.200702 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xjfpj"] Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.201093 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.208370 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.210748 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.210846 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.228444 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.252913 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.265086 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.327192 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.358064 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c26ef8ba-80e9-4ce4-a950-9333ceda4fab-hosts-file\") pod \"node-resolver-xjfpj\" (UID: \"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\") " pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.358451 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6rf\" (UniqueName: \"kubernetes.io/projected/c26ef8ba-80e9-4ce4-a950-9333ceda4fab-kube-api-access-zs6rf\") pod \"node-resolver-xjfpj\" (UID: \"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\") " pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.365221 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.374538 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 21:15:29.738660423 +0000 UTC Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.407671 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.421527 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.421527 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.421656 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.421707 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.421936 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:07 crc kubenswrapper[4931]: E0130 05:08:07.422179 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.425114 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.425633 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.426885 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.427490 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.428460 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" 
path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.428924 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.429501 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.430507 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.435519 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.436117 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.440075 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.440287 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.441282 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.442609 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.443134 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.444107 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.444643 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.445613 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.446034 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.446638 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.447725 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.448174 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.448742 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.450327 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.450989 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.452263 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.452961 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.453983 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.454740 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.455839 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.456322 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.456789 4931 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.456891 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.458918 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.459389 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c26ef8ba-80e9-4ce4-a950-9333ceda4fab-hosts-file\") pod \"node-resolver-xjfpj\" (UID: \"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\") " pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.459526 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs6rf\" (UniqueName: \"kubernetes.io/projected/c26ef8ba-80e9-4ce4-a950-9333ceda4fab-kube-api-access-zs6rf\") pod \"node-resolver-xjfpj\" (UID: \"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\") " pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.459581 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c26ef8ba-80e9-4ce4-a950-9333ceda4fab-hosts-file\") pod \"node-resolver-xjfpj\" (UID: 
\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\") " pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.459441 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.460565 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.461738 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.463403 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.463949 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.465251 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.465971 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.467014 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.467743 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.470233 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.470878 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.473487 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.474278 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.475467 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.476316 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.476593 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.477776 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.478382 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.478928 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.482023 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.482632 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.483622 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.490283 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6rf\" (UniqueName: \"kubernetes.io/projected/c26ef8ba-80e9-4ce4-a950-9333ceda4fab-kube-api-access-zs6rf\") pod \"node-resolver-xjfpj\" (UID: \"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\") " pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.493312 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.514112 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-xjfpj" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.593657 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xjfpj" event={"ID":"c26ef8ba-80e9-4ce4-a950-9333ceda4fab","Type":"ContainerStarted","Data":"59c98c3321fd454e9316234349f3454942f46a14ada507055eb594f5606ec0be"} Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.639320 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wfdxs"] Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.639732 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.642162 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.642345 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.642823 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.643146 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.643361 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.659919 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.690202 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.710762 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.746246 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.762508 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/189be3dc-d439-47c2-b1f2-7413fc4b5e85-mcd-auth-proxy-config\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.762577 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/189be3dc-d439-47c2-b1f2-7413fc4b5e85-proxy-tls\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.762609 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/189be3dc-d439-47c2-b1f2-7413fc4b5e85-rootfs\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.762626 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6mq\" (UniqueName: \"kubernetes.io/projected/189be3dc-d439-47c2-b1f2-7413fc4b5e85-kube-api-access-xl6mq\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.773529 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.790235 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.810176 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.827477 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.842562 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.854321 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:07Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.863358 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6mq\" (UniqueName: \"kubernetes.io/projected/189be3dc-d439-47c2-b1f2-7413fc4b5e85-kube-api-access-xl6mq\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.863462 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/189be3dc-d439-47c2-b1f2-7413fc4b5e85-mcd-auth-proxy-config\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.863524 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/189be3dc-d439-47c2-b1f2-7413fc4b5e85-proxy-tls\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.863601 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/189be3dc-d439-47c2-b1f2-7413fc4b5e85-rootfs\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.863686 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/189be3dc-d439-47c2-b1f2-7413fc4b5e85-rootfs\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.864483 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/189be3dc-d439-47c2-b1f2-7413fc4b5e85-mcd-auth-proxy-config\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.868047 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/189be3dc-d439-47c2-b1f2-7413fc4b5e85-proxy-tls\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.879778 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6mq\" (UniqueName: \"kubernetes.io/projected/189be3dc-d439-47c2-b1f2-7413fc4b5e85-kube-api-access-xl6mq\") pod \"machine-config-daemon-wfdxs\" (UID: \"189be3dc-d439-47c2-b1f2-7413fc4b5e85\") " pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:07 crc kubenswrapper[4931]: I0130 05:08:07.951828 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.071962 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-cdsw5"] Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.074166 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.079869 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.081824 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.081920 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.081961 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.081867 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.082604 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bshbf"] Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.083514 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lm7vv"] Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.084259 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.083701 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.086593 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.087209 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.087444 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.088157 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.088220 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.093739 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.094101 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.095392 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.096185 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.104586 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.121165 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.132851 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.146128 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.164154 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166453 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwbjv\" (UniqueName: \"kubernetes.io/projected/556d9fc5-72b4-4134-8074-1e9d07012763-kube-api-access-rwbjv\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166496 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-os-release\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 
05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166528 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-node-log\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166548 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kkfl\" (UniqueName: \"kubernetes.io/projected/b17d6adf-e35b-4bf8-9ab2-e6720e595835-kube-api-access-5kkfl\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166563 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-ovn\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166579 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-bin\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166600 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/556d9fc5-72b4-4134-8074-1e9d07012763-ovn-node-metrics-cert\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 
crc kubenswrapper[4931]: I0130 05:08:08.166625 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-log-socket\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166642 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-etc-kubernetes\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166662 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-netd\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166678 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-socket-dir-parent\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166697 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-conf-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 
05:08:08.166713 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-var-lib-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166732 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-daemon-config\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166747 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-multus-certs\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166764 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-cnibin\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166779 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-systemd\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc 
kubenswrapper[4931]: I0130 05:08:08.166792 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-netns\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166807 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-systemd-units\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166823 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-cni-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166867 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-cni-bin\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166885 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-slash\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166899 
4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-k8s-cni-cncf-io\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166915 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/06cb3786-294c-45f0-b414-66d84f8d5786-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166953 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166968 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-script-lib\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.166986 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-system-cni-dir\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " 
pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167007 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-ovn-kubernetes\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167141 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfvtg\" (UniqueName: \"kubernetes.io/projected/06cb3786-294c-45f0-b414-66d84f8d5786-kube-api-access-gfvtg\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167190 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b17d6adf-e35b-4bf8-9ab2-e6720e595835-cni-binary-copy\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167229 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-os-release\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167270 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167287 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-cni-multus\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-kubelet\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167348 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-kubelet\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167367 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167390 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-config\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167407 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-hostroot\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167439 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06cb3786-294c-45f0-b414-66d84f8d5786-cni-binary-copy\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167468 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-netns\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167483 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-etc-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167498 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-env-overrides\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167513 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-system-cni-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.167546 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-cnibin\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.179669 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.195177 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.206594 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.220592 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.235209 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.251886 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268330 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-system-cni-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268380 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-cnibin\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268406 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06cb3786-294c-45f0-b414-66d84f8d5786-cni-binary-copy\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268442 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-netns\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268462 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-etc-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268480 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-env-overrides\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268502 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-node-log\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268524 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwbjv\" (UniqueName: \"kubernetes.io/projected/556d9fc5-72b4-4134-8074-1e9d07012763-kube-api-access-rwbjv\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268546 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-os-release\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268601 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-cnibin\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268640 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-netns\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268691 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-node-log\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268651 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-system-cni-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kkfl\" (UniqueName: \"kubernetes.io/projected/b17d6adf-e35b-4bf8-9ab2-e6720e595835-kube-api-access-5kkfl\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268768 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-etc-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268836 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-ovn\") 
pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268871 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-bin\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268911 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/556d9fc5-72b4-4134-8074-1e9d07012763-ovn-node-metrics-cert\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.268947 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-etc-kubernetes\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269002 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-log-socket\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269034 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-netd\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269060 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-socket-dir-parent\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269087 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-conf-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269118 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-var-lib-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269145 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-daemon-config\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269174 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-multus-certs\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 
05:08:08.269203 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-netns\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269233 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-os-release\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269240 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-cnibin\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269280 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-cnibin\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269292 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-systemd\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269316 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-systemd-units\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269325 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-etc-kubernetes\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269334 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-cni-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269358 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-cni-bin\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269366 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-log-socket\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269378 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-k8s-cni-cncf-io\") pod \"multus-lm7vv\" (UID: 
\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269403 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-slash\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269443 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/06cb3786-294c-45f0-b414-66d84f8d5786-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269464 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269411 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-netd\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269487 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-script-lib\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269512 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-ovn-kubernetes\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269532 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-system-cni-dir\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269553 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b17d6adf-e35b-4bf8-9ab2-e6720e595835-cni-binary-copy\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269569 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-socket-dir-parent\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269584 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfvtg\" (UniqueName: \"kubernetes.io/projected/06cb3786-294c-45f0-b414-66d84f8d5786-kube-api-access-gfvtg\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " 
pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269621 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-conf-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269637 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-cni-multus\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269654 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-env-overrides\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269652 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06cb3786-294c-45f0-b414-66d84f8d5786-cni-binary-copy\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269708 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-kubelet\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 
05:08:08.269707 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-slash\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269756 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-multus-certs\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269665 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-var-lib-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269663 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-kubelet\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269835 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-ovn-kubernetes\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269887 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-cni-multus\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269948 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-os-release\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269973 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.269999 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-hostroot\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270051 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-ovn\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270084 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-bin\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270145 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-cni-dir\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270196 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-system-cni-dir\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270219 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-systemd\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270274 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-os-release\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270382 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/06cb3786-294c-45f0-b414-66d84f8d5786-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270436 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-systemd-units\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270458 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-var-lib-cni-bin\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270477 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-openvswitch\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270518 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-hostroot\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270543 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-netns\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc 
kubenswrapper[4931]: I0130 05:08:08.270570 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b17d6adf-e35b-4bf8-9ab2-e6720e595835-host-run-k8s-cni-cncf-io\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270604 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-kubelet\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270639 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270620 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-script-lib\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270662 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-config\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 
05:08:08.270723 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-kubelet\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270728 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270782 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b17d6adf-e35b-4bf8-9ab2-e6720e595835-multus-daemon-config\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.270907 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06cb3786-294c-45f0-b414-66d84f8d5786-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.271018 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b17d6adf-e35b-4bf8-9ab2-e6720e595835-cni-binary-copy\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.271188 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-config\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.274833 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/556d9fc5-72b4-4134-8074-1e9d07012763-ovn-node-metrics-cert\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.279258 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd
/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.290096 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kkfl\" (UniqueName: \"kubernetes.io/projected/b17d6adf-e35b-4bf8-9ab2-e6720e595835-kube-api-access-5kkfl\") pod \"multus-lm7vv\" (UID: \"b17d6adf-e35b-4bf8-9ab2-e6720e595835\") " pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.290877 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwbjv\" 
(UniqueName: \"kubernetes.io/projected/556d9fc5-72b4-4134-8074-1e9d07012763-kube-api-access-rwbjv\") pod \"ovnkube-node-bshbf\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.291867 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfvtg\" (UniqueName: \"kubernetes.io/projected/06cb3786-294c-45f0-b414-66d84f8d5786-kube-api-access-gfvtg\") pod \"multus-additional-cni-plugins-cdsw5\" (UID: \"06cb3786-294c-45f0-b414-66d84f8d5786\") " pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.294150 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.308229 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.321196 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.339828 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.363667 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.374729 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 16:07:38.807879612 +0000 UTC Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.388868 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.391992 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: W0130 05:08:08.401797 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cb3786_294c_45f0_b414_66d84f8d5786.slice/crio-c11516796e3f42abaa2b9fe28fc2d0fca97a48759d8fc7805f4a30e93d339fc8 WatchSource:0}: Error finding container c11516796e3f42abaa2b9fe28fc2d0fca97a48759d8fc7805f4a30e93d339fc8: Status 404 returned error can't find the container with id c11516796e3f42abaa2b9fe28fc2d0fca97a48759d8fc7805f4a30e93d339fc8 Jan 30 
05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.402382 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.411945 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.414134 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lm7vv" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.425785 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.441877 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.456292 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.474144 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.499975 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.600168 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.600226 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"ae003bf2c8441af0b322798040d7d0e26c38e678b0b4800e8ee8c379eec9e42a"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.606306 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.606381 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.606403 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"439d57bdeba26e03a9c77905edbb1cc2c5562b619239519cce547d019fdd2647"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.609489 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xjfpj" event={"ID":"c26ef8ba-80e9-4ce4-a950-9333ceda4fab","Type":"ContainerStarted","Data":"8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.610850 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerStarted","Data":"c11516796e3f42abaa2b9fe28fc2d0fca97a48759d8fc7805f4a30e93d339fc8"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.613163 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerStarted","Data":"71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.613204 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerStarted","Data":"83792258b3ac60be35c11a507467e9e1fc774a91e583e6daa617a47d7f261e8d"} Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.619638 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.637046 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.652081 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.671228 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.700004 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.727029 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.740129 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.754614 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.767395 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.786115 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.839816 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.858450 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.874547 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.900348 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.924160 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.948066 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.963928 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.976476 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.981242 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4931]: E0130 05:08:08.981550 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:12.981501129 +0000 UTC m=+28.351411426 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:08 crc kubenswrapper[4931]: I0130 05:08:08.993220 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:08Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.016242 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.032042 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.044578 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.055971 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.067876 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.081222 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.082589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.082655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.082700 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082759 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082788 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082810 4931 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082826 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.082766 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082891 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:13.082862488 +0000 UTC m=+28.452772965 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082912 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082923 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082943 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.083031 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.082917 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:13.082907099 +0000 UTC m=+28.452817616 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.083097 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:13.083072414 +0000 UTC m=+28.452982661 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.083110 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:13.083103105 +0000 UTC m=+28.453013362 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.100194 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.375716 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 11:35:00.155291584 +0000 UTC Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.421506 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.421661 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.421745 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.422190 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.421930 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:09 crc kubenswrapper[4931]: E0130 05:08:09.422354 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.620256 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932"} Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.624342 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929" exitCode=0 Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.624463 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} Jan 30 05:08:09 crc 
kubenswrapper[4931]: I0130 05:08:09.624517 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.624533 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.624542 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.624551 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.626884 4931 generic.go:334] "Generic (PLEG): container finished" podID="06cb3786-294c-45f0-b414-66d84f8d5786" containerID="7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe" exitCode=0 Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.626974 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerDied","Data":"7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe"} Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.637517 4931 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.656124 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.675074 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.692082 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.718238 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.735084 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.751693 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05
:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.767396 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.790469 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.814477 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.832817 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.849690 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.864328 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.876961 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.896084 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.915261 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.928484 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.950050 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:09 crc kubenswrapper[4931]: I0130 05:08:09.977947 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:09Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.056587 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05
:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.075759 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.094597 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.115041 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.130566 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.148758 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.168604 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.377412 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 22:45:19.04930344 +0000 UTC Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.444308 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vtnpc"] Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.445052 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.448854 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.449178 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.449651 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.449958 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.473903 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.496605 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.497001 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99cb8b56-06fb-4497-82f8-d2ba1887be6a-host\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.497077 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99cb8b56-06fb-4497-82f8-d2ba1887be6a-serviceca\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.497152 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqgf\" (UniqueName: \"kubernetes.io/projected/99cb8b56-06fb-4497-82f8-d2ba1887be6a-kube-api-access-2qqgf\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.516320 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.542790 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.567825 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.584518 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.598086 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqgf\" (UniqueName: \"kubernetes.io/projected/99cb8b56-06fb-4497-82f8-d2ba1887be6a-kube-api-access-2qqgf\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: 
I0130 05:08:10.598196 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99cb8b56-06fb-4497-82f8-d2ba1887be6a-host\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.598213 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.598264 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99cb8b56-06fb-4497-82f8-d2ba1887be6a-serviceca\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.598463 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99cb8b56-06fb-4497-82f8-d2ba1887be6a-host\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.600377 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99cb8b56-06fb-4497-82f8-d2ba1887be6a-serviceca\") pod \"node-ca-vtnpc\" (UID: \"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " 
pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.622859 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.634236 4931 generic.go:334] "Generic (PLEG): container finished" podID="06cb3786-294c-45f0-b414-66d84f8d5786" containerID="79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3" exitCode=0 Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.634338 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerDied","Data":"79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3"} Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.634747 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qqgf\" (UniqueName: \"kubernetes.io/projected/99cb8b56-06fb-4497-82f8-d2ba1887be6a-kube-api-access-2qqgf\") pod \"node-ca-vtnpc\" (UID: 
\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\") " pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.641794 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.642749 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.659031 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.681099 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.705001 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.727906 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.749284 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.761927 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vtnpc" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.774454 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.790780 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.817159 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.834328 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.852755 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.882362 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.923208 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.940755 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.958671 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.972810 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:10 crc kubenswrapper[4931]: I0130 05:08:10.992749 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:10Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.005389 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.022489 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.036302 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.055197 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512a
c6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.378605 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 13:48:58.010254189 +0000 UTC Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.421248 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.421254 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.421387 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.421484 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.421541 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.421651 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.630586 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.649737 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.649786 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.649800 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.649988 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.654873 4931 generic.go:334] "Generic (PLEG): container finished" podID="06cb3786-294c-45f0-b414-66d84f8d5786" containerID="f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f" exitCode=0 Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.654956 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerDied","Data":"f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.656348 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vtnpc" event={"ID":"99cb8b56-06fb-4497-82f8-d2ba1887be6a","Type":"ContainerStarted","Data":"fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.656413 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vtnpc" 
event={"ID":"99cb8b56-06fb-4497-82f8-d2ba1887be6a","Type":"ContainerStarted","Data":"bba9207d024c7e38d49589dca195a930a1fcedd09392415e9254f73018a143e0"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.661105 4931 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.661490 4931 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.663362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.663394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.663407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.663442 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.663458 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.671504 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.687172 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.695894 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.704667 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.704720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.704736 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.704758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.704772 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.717493 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.720482 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.724638 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.724686 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.724698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.724721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.724734 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.735273 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 
05:08:11.738643 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.743027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.743075 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.743086 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.743111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.743128 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.759339 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.762828 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.764182 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.764228 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.764238 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.764259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.764269 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.777366 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: E0130 05:08:11.777495 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.779669 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.779703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.779712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.779728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.779740 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.787621 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.801539 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.816781 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.828864 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.841205 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.854351 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.868791 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.882754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.882799 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.882812 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.882831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.882846 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.888101 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.903367 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.925823 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.944957 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.967664 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.983797 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.986599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.986664 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.986677 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.986704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:11 crc kubenswrapper[4931]: I0130 05:08:11.986722 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:11Z","lastTransitionTime":"2026-01-30T05:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.000197 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:11Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.018549 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.036864 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 
05:08:12.064743 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.080807 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.089636 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.089671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.089681 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc 
kubenswrapper[4931]: I0130 05:08:12.089698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.089710 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.096665 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.111291 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.128453 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.148160 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.168801 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.192439 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.192485 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.192503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.192521 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.192533 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.295612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.295676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.295694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.295719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.295743 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.379013 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 02:48:54.899378078 +0000 UTC Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.399755 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.399805 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.399854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.399876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.399906 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.399929 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.406594 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.416935 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.437993 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64a
fc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.463643 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.485114 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.507962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.508056 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.508077 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.508106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.508128 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.509935 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.529293 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 
05:08:12.551751 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.567534 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.586274 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.600819 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.611223 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.611285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.611304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.611334 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.611356 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.615133 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.632052 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.651802 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30
T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.665120 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" 
event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.667879 4931 generic.go:334] "Generic (PLEG): container finished" podID="06cb3786-294c-45f0-b414-66d84f8d5786" containerID="668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630" exitCode=0 Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.669323 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerDied","Data":"668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.674011 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4
cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: E0130 05:08:12.678334 4931 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.695263 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.708380 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648
795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.714506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.714571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.714592 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 
30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.714620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.714640 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.722918 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.741607 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.761493 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.792630 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.818297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.818363 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.818382 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.818409 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.818471 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.838223 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.879453 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.916371 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.922397 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.922450 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.922462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.922480 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.922494 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:12Z","lastTransitionTime":"2026-01-30T05:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.953965 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z 
is after 2025-08-24T17:21:41Z" Jan 30 05:08:12 crc kubenswrapper[4931]: I0130 05:08:12.993805 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:12Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.025931 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.026019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.026051 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.026064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.026087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.026102 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.026248 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:21.026203204 +0000 UTC m=+36.396113621 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.038289 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.079268 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.115393 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.127598 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.127669 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.127718 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.127750 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.127919 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.127941 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.127956 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128015 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:21.127996485 +0000 UTC m=+36.497906742 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128456 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128494 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:21.128484738 +0000 UTC m=+36.498394995 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128564 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128635 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-30 05:08:21.128625971 +0000 UTC m=+36.498536238 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128700 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128713 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128723 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.128750 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:21.128741695 +0000 UTC m=+36.498651952 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.130219 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.130298 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.130320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.130350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.130368 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.156349 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.199386 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.233870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.233950 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc 
kubenswrapper[4931]: I0130 05:08:13.233978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.234014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.234037 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.337881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.337938 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.337952 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.337975 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.337992 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.381112 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 06:03:46.919517525 +0000 UTC Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.421655 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.421728 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.421672 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.421911 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.422042 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:13 crc kubenswrapper[4931]: E0130 05:08:13.422201 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.441583 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.441653 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.441674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.441726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.441752 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.544913 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.544977 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.544996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.545022 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.545040 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.648162 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.648267 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.648287 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.648318 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.648338 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.676329 4931 generic.go:334] "Generic (PLEG): container finished" podID="06cb3786-294c-45f0-b414-66d84f8d5786" containerID="dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af" exitCode=0 Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.677527 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerDied","Data":"dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.711201 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.735907 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.760381 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc 
kubenswrapper[4931]: I0130 05:08:13.760529 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.760554 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.760586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.760618 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.764257 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.782159 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.798447 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.819580 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.851216 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.863877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.863949 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.863968 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.863997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.864015 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.885883 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.905806 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.925466 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.941254 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.963375 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.966918 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.966960 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.966971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.966990 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.967003 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.977842 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:13 crc kubenswrapper[4931]: I0130 05:08:13.995045 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.012764 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.072011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.072065 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.072083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.072108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.072127 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.175733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.175783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.175802 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.175829 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.175848 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.282788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.283371 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.283394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.283453 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.283475 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.382193 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:51:42.196376228 +0000 UTC Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.386998 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.387058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.387072 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.387095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.387110 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.490715 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.490763 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.490774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.490792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.490805 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.593409 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.593477 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.593489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.593506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.593516 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.684131 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.685015 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.685074 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.696114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.696138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.696146 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.696170 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.696180 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.697055 4931 generic.go:334] "Generic (PLEG): container finished" podID="06cb3786-294c-45f0-b414-66d84f8d5786" containerID="42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead" exitCode=0 Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.697089 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerDied","Data":"42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.715329 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.720174 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\
\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.737415 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.760391 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.782379 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.799965 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 
05:08:14.800027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.800047 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.800075 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.800148 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.802972 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.829383 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.858661 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5
bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.876020 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.893142 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.911121 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.911173 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.911187 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.911210 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.911224 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.912140 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.930940 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.944102 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.957862 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.970657 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:14 crc kubenswrapper[4931]: I0130 05:08:14.987878 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.001398 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-30T05:08:14Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.015316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.015492 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.015508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.015533 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.015549 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.020239 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.039605 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05
:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.053193 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.075971 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.088782 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.120501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.120563 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc 
kubenswrapper[4931]: I0130 05:08:15.120576 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.120598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.120613 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.142479 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03b
e834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.179708 4931 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.180913 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ovn-kubernetes/pods/ovnkube-node-bshbf/status\": read tcp 38.102.83.179:47136->38.102.83.179:6443: use of closed network connection" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.223460 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.223507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.223521 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.223542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.223555 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.228646 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.249457 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.261842 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.272138 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.288933 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.298439 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.312611 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.326629 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.326714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.326757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.326769 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.326788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.326800 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.382693 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 16:25:10.184092603 +0000 UTC Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.421777 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.421843 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:15 crc kubenswrapper[4931]: E0130 05:08:15.421978 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:15 crc kubenswrapper[4931]: E0130 05:08:15.422192 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.422285 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:15 crc kubenswrapper[4931]: E0130 05:08:15.422375 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.428909 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.428962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.428981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.429008 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.429029 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.445021 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.467409 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.485882 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.507162 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.529764 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.531755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.531831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.531852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.531897 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.531916 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.554974 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.568380 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.593175 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.625771 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.635010 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.635081 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.635107 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.635142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.635167 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.660051 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.681853 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.702360 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.716155 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" event={"ID":"06cb3786-294c-45f0-b414-66d84f8d5786","Type":"ContainerStarted","Data":"06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.716256 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.721081 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.737355 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.737488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.737549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.737572 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc 
kubenswrapper[4931]: I0130 05:08:15.737599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.737620 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.751946 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.780757 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.798869 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.812562 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.830467 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.841276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.841330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.841344 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.841368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.841389 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.844979 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.863231 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.881315 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.923151 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.944484 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.944542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.944562 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.944590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.944610 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4931]: I0130 05:08:15.967059 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z 
is after 2025-08-24T17:21:41Z" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.000918 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01
-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.038673 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:16Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.047608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.047665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.047679 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.047702 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.047717 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.075913 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:16Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.119261 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:16Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.151908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.151980 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc 
kubenswrapper[4931]: I0130 05:08:16.151999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.152033 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.152054 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.162613 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:16Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.209323 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:16Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.255880 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.256339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.256596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.256760 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.257111 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.363169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.363229 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.363245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.363266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.363281 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.383101 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 16:05:46.415415386 +0000 UTC Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.467163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.467244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.467268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.467367 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.467443 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.570684 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.570759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.570782 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.570814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.570838 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.675245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.675306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.675318 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.675343 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.675356 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.719890 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.779089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.779168 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.779199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.779232 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.779255 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.883027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.883080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.883099 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.883123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.883142 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.985923 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.985960 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.985969 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.985985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4931]: I0130 05:08:16.985997 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.088675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.088726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.088735 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.088752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.088766 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.130713 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.153547 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.173366 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.190764 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.190804 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.190815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.190834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.190846 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.191144 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.214896 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.237601 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.262608 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.281320 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.294083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.294145 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.294160 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.294183 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.294200 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.299245 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.315223 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.330404 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.348805 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.369560 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30
T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.384279 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2026-01-13 20:21:30.021768767 +0000 UTC Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.397257 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.397344 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.397370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.397398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.397415 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.397325 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.417315 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05
:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.421000 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.421108 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.421022 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:17 crc kubenswrapper[4931]: E0130 05:08:17.421201 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:17 crc kubenswrapper[4931]: E0130 05:08:17.421305 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:17 crc kubenswrapper[4931]: E0130 05:08:17.421453 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.440013 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.500856 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.501644 4931 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.501668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.501696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.501714 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.604800 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.604909 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.604927 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.604955 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.604974 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.709724 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.709794 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.709815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.709845 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.709868 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.731141 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/0.log" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.735606 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4" exitCode=1 Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.735841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.738317 4931 scope.go:117] "RemoveContainer" containerID="87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.758185 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.777174 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.794356 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.814120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.814190 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.814215 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.814245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.814265 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.823327 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.845001 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5
b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.866114 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.886645 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.914877 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.917976 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.918033 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.918338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.918369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.918381 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.943071 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610022 6226 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610537 6226 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 05:08:17.611197 6226 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:17.611235 6226 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:17.611280 6226 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:17.611321 6226 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:17.611337 6226 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:17.611361 6226 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:17.611362 6226 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:17.611384 6226 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:17.611466 6226 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:17.611468 6226 factory.go:656] Stopping watch factory\\\\nI0130 05:08:17.611490 6226 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.962236 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.982949 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:17 crc kubenswrapper[4931]: I0130 05:08:17.998159 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:17Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.014482 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.026002 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.026330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.026369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.026390 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.026415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.026449 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.037513 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.129057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.129128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.129141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.129166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.129178 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.232878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.232940 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.232954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.232979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.232992 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.335189 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.335266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.335278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.335298 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.335309 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.384611 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 18:46:32.857039471 +0000 UTC Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.437536 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.437967 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.437984 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.438006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.438019 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.541108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.541158 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.541168 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.541186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.541197 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.644335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.644393 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.644407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.644429 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.644455 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.741174 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/0.log" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.743983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.744166 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.746126 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.746179 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.746194 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.746216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.746232 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.802033 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.817555 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05
:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.832150 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.849389 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.849507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.849531 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.849565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.849584 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.849799 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.863917 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.876529 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.898471 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.934950 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610022 6226 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610537 6226 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 05:08:17.611197 6226 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:17.611235 6226 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:17.611280 6226 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:17.611321 6226 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:17.611337 6226 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:17.611361 6226 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:17.611362 6226 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:17.611384 6226 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:17.611466 6226 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:17.611468 6226 factory.go:656] Stopping watch factory\\\\nI0130 05:08:17.611490 6226 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.958877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.958955 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.958975 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.959004 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.959023 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.972343 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:18 crc kubenswrapper[4931]: I0130 05:08:18.994015 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:18Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.013639 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.033480 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.051406 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.062058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.062132 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.062153 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.062185 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.062205 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.070227 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.086508 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.165923 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.165999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.166023 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.166058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.166085 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.269335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.269412 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.269434 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.269494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.269517 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.374274 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.374360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.374385 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.374422 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.374454 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.385155 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 13:21:35.37487329 +0000 UTC Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.422216 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.422287 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.422237 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:19 crc kubenswrapper[4931]: E0130 05:08:19.422479 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:19 crc kubenswrapper[4931]: E0130 05:08:19.422608 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:19 crc kubenswrapper[4931]: E0130 05:08:19.422845 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.478906 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.478978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.479000 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.479033 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.479055 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.582413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.582551 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.582581 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.582619 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.582662 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.686418 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.686543 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.686564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.686597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.686619 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.751292 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/1.log"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.752418 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/0.log"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.757630 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431" exitCode=1
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.757718 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431"}
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.757871 4931 scope.go:117] "RemoveContainer" containerID="87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.759540 4931 scope.go:117] "RemoveContainer" containerID="dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431"
Jan 30 05:08:19 crc kubenswrapper[4931]: E0130 05:08:19.759915 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.781507 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.790064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.790159 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.790185 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.790215 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.790234 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.800792 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.816852 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.836196 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.856875 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.882013 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.894055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.894116 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.894129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.894154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.894169 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.905231 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.926952 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.950869 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z"
Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.988376 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-
30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:19Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.997288 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.997371 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.997398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.997479 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:19 crc kubenswrapper[4931]: I0130 05:08:19.997511 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.015236 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:20Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.043816 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:20Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.067858 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:20Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.090294 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:20Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.105647 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.105706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.105725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.105752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.105770 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.131797 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610022 6226 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610537 6226 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 05:08:17.611197 6226 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:17.611235 6226 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:17.611280 6226 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:17.611321 6226 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:17.611337 6226 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:17.611361 6226 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:17.611362 6226 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:17.611384 6226 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:17.611466 6226 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:17.611468 6226 factory.go:656] Stopping watch factory\\\\nI0130 05:08:17.611490 6226 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] 
Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/
lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:20Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.209531 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 
crc kubenswrapper[4931]: I0130 05:08:20.209595 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.209617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.209648 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.209668 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.313642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.314126 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.314310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.314493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.314639 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.386388 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 02:11:43.939044063 +0000 UTC Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.418303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.418652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.418863 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.419007 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.419156 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.523394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.523488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.523509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.523540 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.523563 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.627674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.627753 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.627774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.627804 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.627829 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.731393 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.731497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.731522 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.731552 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.731571 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.765933 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/1.log" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.834710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.834772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.834792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.834818 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.834836 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.938471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.938538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.938552 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.938574 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.938596 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.954211 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw"] Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.954885 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.958780 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.959426 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 30 05:08:20 crc kubenswrapper[4931]: I0130 05:08:20.993175 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:20Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.017273 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5
b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.032337 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.032598 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.032556449 +0000 UTC m=+52.402466716 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.032692 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dss26\" (UniqueName: \"kubernetes.io/projected/f069d6db-7396-4c40-9ea9-4cc66c499cb2-kube-api-access-dss26\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.032749 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f069d6db-7396-4c40-9ea9-4cc66c499cb2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.032824 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f069d6db-7396-4c40-9ea9-4cc66c499cb2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.032932 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/f069d6db-7396-4c40-9ea9-4cc66c499cb2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.036751 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.041856 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.041908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.041924 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.041947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.041959 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.067069 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.097765 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134561 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134680 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f069d6db-7396-4c40-9ea9-4cc66c499cb2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134725 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f069d6db-7396-4c40-9ea9-4cc66c499cb2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134793 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134844 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.134858 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.134902 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134908 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134951 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dss26\" (UniqueName: \"kubernetes.io/projected/f069d6db-7396-4c40-9ea9-4cc66c499cb2-kube-api-access-dss26\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.134989 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f069d6db-7396-4c40-9ea9-4cc66c499cb2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.134918 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135720 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.135685955 +0000 UTC m=+52.505596252 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135513 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135794 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135890 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.13586578 +0000 UTC m=+52.505776077 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135894 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135593 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135947 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.135965 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.135951632 +0000 UTC m=+52.505861899 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.136081 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.136044274 +0000 UTC m=+52.505954571 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.136428 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f069d6db-7396-4c40-9ea9-4cc66c499cb2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.137172 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f069d6db-7396-4c40-9ea9-4cc66c499cb2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 
05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.137993 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610022 6226 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610537 6226 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 05:08:17.611197 6226 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:17.611235 6226 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:17.611280 6226 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:17.611321 6226 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:17.611337 6226 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:17.611361 6226 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:17.611362 6226 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:17.611384 6226 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:17.611466 6226 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:17.611468 6226 factory.go:656] Stopping watch factory\\\\nI0130 05:08:17.611490 6226 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] 
Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/
lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.146523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 
crc kubenswrapper[4931]: I0130 05:08:21.146583 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.146599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.146618 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.146630 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.147385 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f069d6db-7396-4c40-9ea9-4cc66c499cb2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.160360 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.175725 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dss26\" (UniqueName: \"kubernetes.io/projected/f069d6db-7396-4c40-9ea9-4cc66c499cb2-kube-api-access-dss26\") pod \"ovnkube-control-plane-749d76644c-rmtbw\" (UID: \"f069d6db-7396-4c40-9ea9-4cc66c499cb2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.179728 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.197411 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.211820 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.226420 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.240515 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.250513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.250609 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.250633 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.250670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.250696 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.257339 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.279832 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.281801 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.302490 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: W0130 05:08:21.303182 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf069d6db_7396_4c40_9ea9_4cc66c499cb2.slice/crio-c1c7d22a7d6fe8702f3429f3b1131d5621d02da180c142f6726c1722da402ce4 WatchSource:0}: Error finding container c1c7d22a7d6fe8702f3429f3b1131d5621d02da180c142f6726c1722da402ce4: Status 404 returned error can't find the container with id c1c7d22a7d6fe8702f3429f3b1131d5621d02da180c142f6726c1722da402ce4 Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.323683 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.354847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.354920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.354942 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.354971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.354993 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.387281 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:28:16.861045803 +0000 UTC Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.422001 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.422222 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.422533 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.422684 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.422914 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:21 crc kubenswrapper[4931]: E0130 05:08:21.423037 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.458403 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.458470 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.458485 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.458507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.458521 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.562181 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.562258 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.562276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.562307 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.562329 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.665242 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.665307 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.665325 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.665454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.665482 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.769752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.769805 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.769814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.769835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.769851 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.777490 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" event={"ID":"f069d6db-7396-4c40-9ea9-4cc66c499cb2","Type":"ContainerStarted","Data":"61ebe22d2604043e99ae63c45b8dd031466c26632279b08cb8cab1c220d18a7f"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.777542 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" event={"ID":"f069d6db-7396-4c40-9ea9-4cc66c499cb2","Type":"ContainerStarted","Data":"6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.777554 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" event={"ID":"f069d6db-7396-4c40-9ea9-4cc66c499cb2","Type":"ContainerStarted","Data":"c1c7d22a7d6fe8702f3429f3b1131d5621d02da180c142f6726c1722da402ce4"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.795291 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.811017 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.828043 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.848437 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.868603 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.872991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.873060 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.873076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.873101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.873116 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.889084 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.920017 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.949965 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610022 6226 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610537 6226 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 05:08:17.611197 6226 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:17.611235 6226 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:17.611280 6226 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:17.611321 6226 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:17.611337 6226 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:17.611361 6226 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:17.611362 6226 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:17.611384 6226 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:17.611466 6226 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:17.611468 6226 factory.go:656] Stopping watch factory\\\\nI0130 05:08:17.611490 6226 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] 
Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/
lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.974108 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.975931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.975991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.976004 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.976027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.976043 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4931]: I0130 05:08:21.990482 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:21Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.008216 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.019928 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.033793 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.046223 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.062823 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.079383 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.079498 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.079525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc 
kubenswrapper[4931]: I0130 05:08:22.079557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.079583 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.080938 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.099962 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gt48b"] Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.100621 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.100697 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.123817 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e
512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.136828 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.154854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.154892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.154901 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.154920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.154932 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.169151 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.185994 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.187314 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.194026 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.194238 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.194380 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.194589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.194754 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.209573 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc46847
9530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.218252 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.224004 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.224039 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.224050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.224071 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.224086 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.228596 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.241992 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.243882 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.247180 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.247267 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.247288 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.247348 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.247369 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.250456 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dkp\" (UniqueName: \"kubernetes.io/projected/1421762e-4873-46cb-8c43-b8faa0cbca62-kube-api-access-b5dkp\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.250607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.269313 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.270648 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",
\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.277939 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.278046 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.278108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.278142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.278207 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.296172 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.296481 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.299128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.299186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.299248 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.299282 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.299305 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.299745 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610022 6226 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610537 6226 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 05:08:17.611197 6226 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:17.611235 6226 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:17.611280 6226 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:17.611321 6226 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:17.611337 6226 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:17.611361 6226 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:17.611362 6226 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:17.611384 6226 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:17.611466 6226 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:17.611468 6226 factory.go:656] Stopping watch factory\\\\nI0130 05:08:17.611490 6226 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] 
Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/
lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.328357 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.346652 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.351929 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dkp\" (UniqueName: \"kubernetes.io/projected/1421762e-4873-46cb-8c43-b8faa0cbca62-kube-api-access-b5dkp\") pod \"network-metrics-daemon-gt48b\" 
(UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.352083 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.352302 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.352452 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:22.852384967 +0000 UTC m=+38.222295234 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.359368 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.373947 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.378495 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5dkp\" (UniqueName: \"kubernetes.io/projected/1421762e-4873-46cb-8c43-b8faa0cbca62-kube-api-access-b5dkp\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.388038 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 19:56:29.627971081 +0000 UTC Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.391760 4931 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.401527 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.401661 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.401741 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.401849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.401929 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.410444 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1a
a7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.426240 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc 
kubenswrapper[4931]: I0130 05:08:22.446260 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:22Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.505190 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.505254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.505277 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.505309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.505332 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.610679 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.611194 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.611220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.611258 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.611283 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.714888 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.714961 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.714986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.715018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.715045 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.818726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.818819 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.818853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.818884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.818906 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.859963 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.860163 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:22 crc kubenswrapper[4931]: E0130 05:08:22.860256 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:23.86023031 +0000 UTC m=+39.230140607 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.922873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.922946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.922965 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.922994 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4931]: I0130 05:08:22.923013 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.027038 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.027122 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.027145 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.027178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.027203 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.131120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.131209 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.131229 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.131264 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.131284 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.235361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.235463 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.235490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.235524 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.235544 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.339071 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.339496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.339670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.339815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.339942 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.388978 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:56:51.306995113 +0000 UTC Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.422045 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.422137 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.422050 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:23 crc kubenswrapper[4931]: E0130 05:08:23.422277 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.422332 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:23 crc kubenswrapper[4931]: E0130 05:08:23.422552 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:23 crc kubenswrapper[4931]: E0130 05:08:23.422770 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:23 crc kubenswrapper[4931]: E0130 05:08:23.422840 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.447230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.447321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.447352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.447388 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.447416 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.551538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.551610 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.551632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.551665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.551690 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.654996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.655078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.655103 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.655140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.655164 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.759185 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.759281 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.759302 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.759334 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.759356 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.862637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.862713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.862732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.862762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.862781 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.872616 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:23 crc kubenswrapper[4931]: E0130 05:08:23.872863 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:23 crc kubenswrapper[4931]: E0130 05:08:23.872976 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:25.8729457 +0000 UTC m=+41.242855987 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.966476 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.966549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.966573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.966606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4931]: I0130 05:08:23.966629 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.070394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.070516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.070536 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.070567 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.070598 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.174092 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.174180 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.174201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.174232 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.174256 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.278412 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.278560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.278587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.278621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.278642 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.381843 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.381965 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.381991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.382022 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.382046 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.390417 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 07:33:43.845410119 +0000 UTC Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.485802 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.485873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.485894 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.485924 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.485953 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.589687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.589749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.589772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.589808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.589831 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.693550 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.693634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.693654 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.693685 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.693711 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.797252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.797362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.797381 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.797409 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.797458 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.901415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.901517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.901539 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.901566 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4931]: I0130 05:08:24.901587 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.005026 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.005116 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.005177 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.005212 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.005276 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.109201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.109279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.109297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.109671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.109711 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.212995 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.213076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.213098 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.213129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.213150 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.316222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.316297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.316321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.316350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.316369 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.391378 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 11:06:08.700387544 +0000 UTC Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.419530 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.419612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.419630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.419656 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.419676 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.421548 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.421760 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.421917 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:25 crc kubenswrapper[4931]: E0130 05:08:25.426938 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.427041 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:25 crc kubenswrapper[4931]: E0130 05:08:25.427560 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:25 crc kubenswrapper[4931]: E0130 05:08:25.428296 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:25 crc kubenswrapper[4931]: E0130 05:08:25.428672 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.450710 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.471474 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.490332 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.509132 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.522943 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.523036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.523055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.523084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.523106 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.525519 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.545187 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.570097 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.589381 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.609189 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.626783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.626835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.626853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.626881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.626902 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.631250 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.652612 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.688354 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64af
c4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.713912 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5
b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.730278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.730350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.730370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.730486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.730507 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.735968 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.758350 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.783018 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.819178 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87290a8fe38d346f2c0b20f0e53e42593ca5132dec04b7b2f163e3355d7865f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"message\\\":\\\"or.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610022 6226 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:17.610537 6226 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 05:08:17.611197 6226 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:17.611235 6226 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:17.611280 6226 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:17.611321 6226 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:17.611337 6226 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:17.611361 6226 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:17.611362 6226 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:17.611384 6226 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:17.611466 6226 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:17.611468 6226 factory.go:656] Stopping watch factory\\\\nI0130 05:08:17.611490 6226 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] 
Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/
lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.833585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 
crc kubenswrapper[4931]: I0130 05:08:25.833631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.833650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.833681 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.833701 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.897627 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:25 crc kubenswrapper[4931]: E0130 05:08:25.897912 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:25 crc kubenswrapper[4931]: E0130 05:08:25.898104 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:29.898058301 +0000 UTC m=+45.267968678 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.936974 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.937034 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.937053 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.937087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4931]: I0130 05:08:25.937107 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.040352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.040458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.040480 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.040509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.040533 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.149096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.149199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.149222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.149254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.149281 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.252792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.252875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.252901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.252934 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.252961 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.356908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.356996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.357016 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.357048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.357067 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.392183 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 10:33:32.404804631 +0000 UTC Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.460510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.460586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.460610 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.460648 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.460674 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.563634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.563726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.563753 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.563785 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.563808 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.667375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.667486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.667517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.667542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.667563 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.771572 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.771627 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.771644 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.771674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.771693 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.876196 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.876251 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.876263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.876282 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.876295 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.980184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.980261 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.980281 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.980312 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4931]: I0130 05:08:26.980334 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.083668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.083712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.083730 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.083752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.083773 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.187964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.188044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.188064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.188094 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.188114 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.292189 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.292252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.292272 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.292306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.292329 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.392993 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 11:16:52.966770067 +0000 UTC Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.395763 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.395809 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.395826 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.395853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.395870 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.421391 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.421591 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:27 crc kubenswrapper[4931]: E0130 05:08:27.421818 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.422067 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.422150 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:27 crc kubenswrapper[4931]: E0130 05:08:27.422320 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:27 crc kubenswrapper[4931]: E0130 05:08:27.422626 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:27 crc kubenswrapper[4931]: E0130 05:08:27.422842 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.498841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.498929 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.498962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.498998 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.499026 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.603097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.603179 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.603203 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.603235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.603268 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.707750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.707852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.707875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.707909 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.707931 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.811682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.811777 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.811801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.811834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.811855 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.915576 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.915652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.915671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.915705 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4931]: I0130 05:08:27.915730 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.019109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.019578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.019734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.019872 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.020017 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.123276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.123356 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.123375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.123404 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.123462 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.234149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.234262 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.234282 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.234309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.234330 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.338527 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.338601 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.338620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.338650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.338672 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.393523 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:04:33.413322942 +0000 UTC Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.442221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.442699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.442945 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.443213 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.443644 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.547885 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.548236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.548587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.548856 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.549149 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.652397 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.652483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.652502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.652529 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.652546 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.755517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.755608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.755639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.755678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.755703 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.859952 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.860005 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.860027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.860059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.860084 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.963746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.963834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.963865 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.963900 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4931]: I0130 05:08:28.963923 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.067245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.067714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.067948 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.068019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.068043 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.170994 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.171068 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.171089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.171118 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.171138 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.275226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.275300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.275318 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.275347 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.275363 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.379036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.379111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.379132 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.379163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.379182 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.394446 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 04:38:40.375586001 +0000 UTC
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.421091 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.421208 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.421371 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.421458 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:29 crc kubenswrapper[4931]: E0130 05:08:29.421560 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62"
Jan 30 05:08:29 crc kubenswrapper[4931]: E0130 05:08:29.421664 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:08:29 crc kubenswrapper[4931]: E0130 05:08:29.421974 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:08:29 crc kubenswrapper[4931]: E0130 05:08:29.422137 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.483680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.483736 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.483761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.483788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.483803 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.586801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.586846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.586859 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.586878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.586891 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.690478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.690555 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.690572 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.690600 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.690617 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.794674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.794746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.794767 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.794793 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.794815 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.901540 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.902512 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.902535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.902565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.902587 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:29 crc kubenswrapper[4931]: I0130 05:08:29.954637 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b"
Jan 30 05:08:29 crc kubenswrapper[4931]: E0130 05:08:29.954884 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 30 05:08:29 crc kubenswrapper[4931]: E0130 05:08:29.955017 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.954983128 +0000 UTC m=+53.324893425 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.006993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.007063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.007084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.007114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.007137 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.111956 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.112021 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.112040 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.112067 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.112087 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.216166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.216220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.216237 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.216264 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.216282 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.324400 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.324551 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.324579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.324608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.324635 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.394580 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 05:33:40.440856961 +0000 UTC
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.427792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.427848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.427869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.427893 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.427913 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.531531 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.531593 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.531611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.531639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.531660 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.642350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.642406 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.642422 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.642454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.642465 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.754761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.754809 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.754828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.754850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.754867 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.858734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.859264 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.859777 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.860009 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.860222 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.964057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.964533 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.964801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.965032 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:30 crc kubenswrapper[4931]: I0130 05:08:30.965244 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.068256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.068303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.068320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.068343 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.068361 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.171326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.171375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.171395 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.171417 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.171472 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.273579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.273642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.273659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.273683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.273700 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.377010 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.377066 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.377087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.377110 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.377132 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.395484 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 09:14:08.533080136 +0000 UTC
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.421773 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.421801 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.421930 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b"
Jan 30 05:08:31 crc kubenswrapper[4931]: E0130 05:08:31.422002 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.422065 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:31 crc kubenswrapper[4931]: E0130 05:08:31.422157 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:08:31 crc kubenswrapper[4931]: E0130 05:08:31.422214 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:08:31 crc kubenswrapper[4931]: E0130 05:08:31.422401 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.479618 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.479672 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.479689 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.479715 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.479733 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.582652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.582717 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.582738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.582766 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.582783 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.685285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.685611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.685698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.685808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.685891 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.825608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.825692 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.825716 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.826098 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.826641 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.931020 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.931091 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.931120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.931151 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4931]: I0130 05:08:31.931172 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.034869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.034949 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.034969 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.034996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.035016 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.138209 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.138274 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.138305 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.138337 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.138361 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.241685 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.241739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.241753 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.241775 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.241795 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.344575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.344665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.344684 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.344710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.344728 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.396547 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 08:37:04.518855358 +0000 UTC Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.422992 4931 scope.go:117] "RemoveContainer" containerID="dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.443744 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc 
kubenswrapper[4931]: I0130 05:08:32.448127 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.448188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.448207 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.448235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.448259 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.466766 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.485871 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.512232 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.533220 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.551391 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.553060 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.553149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.553181 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.553216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.553241 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.572772 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.593168 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.615476 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.642955 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.650978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.651040 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.651061 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.651094 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.651117 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.667615 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z 
is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: E0130 05:08:32.671070 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.677249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.677307 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.677327 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.677355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.677377 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.696122 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: E0130 05:08:32.698637 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.703776 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.703824 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.703844 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.703869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.703889 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.719952 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: E0130 05:08:32.723917 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.735849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.735918 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.735942 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.735972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.735995 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.744507 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: E0130 05:08:32.753422 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.758506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.759031 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.759054 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.759079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.759109 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.763851 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: E0130 05:08:32.779086 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: E0130 05:08:32.779232 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.781489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.781302 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.781534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.781720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.781780 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.781809 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.809336 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.836267 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/1.log" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.840581 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.840768 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.865568 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.885504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.885595 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.885620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.885658 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.885683 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.914173 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z 
is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.938158 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4
757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.969377 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5
b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.984337 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:32Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.988398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.988488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.988510 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.988541 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:32 crc kubenswrapper[4931]: I0130 05:08:32.988564 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.052362 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.075394 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.090545 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.090575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.090585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.090603 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.090616 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.097060 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\
\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.119239 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6a
e2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.130219 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.141521 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.153084 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.165777 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.176922 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.187096 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc 
kubenswrapper[4931]: I0130 05:08:33.193163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.193199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.193211 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.193229 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.193241 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.200514 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c26632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.217282 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.295948 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.296044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.296058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.296078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.296093 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.396851 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 04:18:22.04971318 +0000 UTC Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.399169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.399326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.399356 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.399390 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.399416 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.421058 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.421153 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.421194 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:33 crc kubenswrapper[4931]: E0130 05:08:33.421346 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.421638 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:33 crc kubenswrapper[4931]: E0130 05:08:33.421754 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:33 crc kubenswrapper[4931]: E0130 05:08:33.421983 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:33 crc kubenswrapper[4931]: E0130 05:08:33.422217 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.503348 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.503446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.503466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.503494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.503513 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.606507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.606582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.606603 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.606630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.606650 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.709322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.709388 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.709406 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.709508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.709575 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.813221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.813304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.813327 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.813358 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.813382 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.846973 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/2.log" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.847977 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/1.log" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.851917 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26" exitCode=1 Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.851989 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.852054 4931 scope.go:117] "RemoveContainer" containerID="dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.853177 4931 scope.go:117] "RemoveContainer" containerID="43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26" Jan 30 05:08:33 crc kubenswrapper[4931]: E0130 05:08:33.853500 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.879451 4931 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.905003 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.916452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.916568 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.916596 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.916626 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.916650 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.926694 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.949209 4931 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.972092 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:33 crc kubenswrapper[4931]: I0130 05:08:33.997223 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:33Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.018613 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] 
Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/
networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.020965 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc 
kubenswrapper[4931]: I0130 05:08:34.021096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.021167 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.021239 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.021300 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.050531 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.073300 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.089738 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.104037 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.119333 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.124006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.124057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.124071 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc 
kubenswrapper[4931]: I0130 05:08:34.124096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.124110 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.133593 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.147801 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc 
kubenswrapper[4931]: I0130 05:08:34.168394 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.186382 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.203301 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.227466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.227532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.227552 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.227579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.227599 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.331816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.332222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.332376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.332585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.332751 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.397347 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 05:00:00.473161663 +0000 UTC Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.436066 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.436137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.436158 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.436184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.436205 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.539293 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.539373 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.539397 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.539462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.539495 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.642877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.642950 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.642968 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.642994 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.643018 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.746727 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.746789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.746810 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.746834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.746856 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.816580 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.829134 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.840297 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.849908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.849968 4931 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.849993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.850026 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.850051 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.859780 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/2.log" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.860242 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.882812 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.904865 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.928890 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.943610 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.944774 4931 scope.go:117] "RemoveContainer" containerID="43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26" Jan 30 05:08:34 crc kubenswrapper[4931]: E0130 05:08:34.945006 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.949130 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.953095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.953195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.953253 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 
05:08:34.953280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.953298 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:34 crc kubenswrapper[4931]: I0130 05:08:34.974776 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a1
10d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:34Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.013085 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dffbdbccf345b155eeca2fb349ba6b9e2a9a2f1251e04b84dc4aac74331fb431\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:18Z\\\",\\\"message\\\":\\\"08:18.741461 6358 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:18.741588 6358 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 05:08:18.741211 6358 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 05:08:18.741681 6358 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:18.741705 6358 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:18.741783 6358 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:18.741792 6358 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:18.741800 6358 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:18.741808 6358 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:18.741833 6358 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:18.741843 6358 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:18.741853 6358 factory.go:656] Stopping watch factory\\\\nI0130 05:08:18.741870 6358 nad_controller.go:166] [zone-nad-controller NAD controller]: shutting down\\\\nI0130 05:08:18.741878 6358 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] 
Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/
networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.049559 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.059051 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.059112 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.059132 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.059163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.059183 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.067728 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.085323 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.102119 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.121028 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.136797 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.150887 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.162852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.162929 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.162949 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.162977 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.162997 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.168001 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.183171 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.204068 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 
05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.219834 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.282836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 
05:08:35.282890 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.282902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.282928 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.282943 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.287616 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.306213 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.323727 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 
05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.345307 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.358226 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.368865 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.378566 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.385762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.385831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.385849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.385878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.385896 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.393636 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.398568 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 09:56:39.028715506 +0000 UTC Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 
05:08:35.405161 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.417478 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc 
kubenswrapper[4931]: I0130 05:08:35.420947 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.421044 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:35 crc kubenswrapper[4931]: E0130 05:08:35.421129 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.421171 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:35 crc kubenswrapper[4931]: E0130 05:08:35.421209 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.421263 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:35 crc kubenswrapper[4931]: E0130 05:08:35.421410 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:35 crc kubenswrapper[4931]: E0130 05:08:35.421559 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.432076 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.448673 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.462581 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.482357 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.488802 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.488858 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.488877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.488903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.488948 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.502841 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z 
is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.524710 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4
757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.550918 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 
05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.587779 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.591930 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.591999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.592019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.592044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.592063 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.618459 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.640221 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.660010 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.679684 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.693317 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.695384 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.695436 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.695449 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.695466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.695479 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.709488 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc 
kubenswrapper[4931]: I0130 05:08:35.732815 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.754031 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.777123 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.797648 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.799026 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.799097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.799116 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.799140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.799156 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.819094 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.837377 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.857333 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.881926 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.902768 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.902839 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.902859 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.902888 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.902908 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:35Z","lastTransitionTime":"2026-01-30T05:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.907376 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:35 crc kubenswrapper[4931]: I0130 05:08:35.923922 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05
:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:35Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.006639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.006733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.006754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.006787 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.006809 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.111796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.111863 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.111883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.111914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.111936 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.215953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.216025 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.216045 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.216078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.216099 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.320650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.320707 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.320718 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.320740 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.320754 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.399022 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:51:54.501664458 +0000 UTC Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.424127 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.424184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.424203 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.424231 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.424251 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.528574 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.528650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.528668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.528706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.528726 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.632252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.632317 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.632338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.632368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.632388 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.736481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.736564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.736586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.736614 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.736633 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.850760 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.850818 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.850839 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.850870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.850897 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.954477 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.954532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.954550 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.954577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:36 crc kubenswrapper[4931]: I0130 05:08:36.954595 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:36Z","lastTransitionTime":"2026-01-30T05:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.050617 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.051010 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 05:09:09.050949528 +0000 UTC m=+84.420859825 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.057513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.057571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.057591 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.057620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.057640 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.152095 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.152164 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.152188 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.152218 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152287 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152442 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152462 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:09:09.15239147 +0000 UTC m=+84.522301767 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152519 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:09:09.152497193 +0000 UTC m=+84.522407450 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152520 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152556 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152631 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152660 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152748 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:09:09.152714709 +0000 UTC m=+84.522624996 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152573 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152801 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.152906 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:09:09.152876803 +0000 UTC m=+84.522787260 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.160637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.160704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.160723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.160753 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.160776 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.264162 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.264215 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.264233 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.264262 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.264281 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.367678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.367739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.367757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.367791 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.367810 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.399690 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 23:57:20.171866399 +0000 UTC Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.421711 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.421798 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.421927 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.421942 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.422092 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.422223 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.422456 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.422561 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.470396 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.470489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.470509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.470538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.470561 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.573524 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.573596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.573615 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.573642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.573667 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.677187 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.677235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.677253 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.677278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.677299 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.781490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.781577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.781606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.781639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.781663 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.884599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.884674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.884693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.884721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.884739 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.964264 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.964511 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: E0130 05:08:37.964613 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:53.96459396 +0000 UTC m=+69.334504217 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.987241 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.987287 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.987299 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.987316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:37 crc kubenswrapper[4931]: I0130 05:08:37.987328 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:37Z","lastTransitionTime":"2026-01-30T05:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.090778 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.090907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.090931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.090962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.090985 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.194903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.194952 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.194966 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.194985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.195000 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.298080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.298125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.298135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.298154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.298167 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.399860 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 22:00:42.489423639 +0000 UTC Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.401329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.401385 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.401402 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.401454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.401475 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.504355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.504493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.504514 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.504538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.504555 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.607640 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.607677 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.607689 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.607706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.607717 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.710564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.710625 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.710642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.710666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.710684 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.813659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.813711 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.813730 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.813755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.813774 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.916229 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.916276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.916289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.916307 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:38 crc kubenswrapper[4931]: I0130 05:08:38.916320 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:38Z","lastTransitionTime":"2026-01-30T05:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.019734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.019788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.019806 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.019829 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.019847 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.122793 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.122879 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.122898 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.122925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.122945 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.226213 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.226317 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.226341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.226371 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.226391 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.330028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.330113 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.330138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.330178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.330198 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.400181 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 10:29:44.225109561 +0000 UTC Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.421671 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.421727 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.421862 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.422136 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:39 crc kubenswrapper[4931]: E0130 05:08:39.422139 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:39 crc kubenswrapper[4931]: E0130 05:08:39.422348 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:39 crc kubenswrapper[4931]: E0130 05:08:39.422397 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:39 crc kubenswrapper[4931]: E0130 05:08:39.422510 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.433040 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.433083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.433095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.433121 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.433138 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.537311 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.537361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.537380 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.537405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.537454 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.641023 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.641122 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.641140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.641167 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.641189 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.745263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.745343 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.745374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.745407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.745467 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.849415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.849545 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.849575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.849611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.849633 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.953236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.953359 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.953389 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.953475 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:39 crc kubenswrapper[4931]: I0130 05:08:39.953510 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:39Z","lastTransitionTime":"2026-01-30T05:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.056948 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.057029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.057054 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.057085 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.057107 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.160503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.160584 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.160604 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.160635 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.160659 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.264276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.264351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.264369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.264400 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.264446 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.369481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.369557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.369577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.369602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.369621 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.400802 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 05:42:44.225339105 +0000 UTC Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.473134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.473202 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.473223 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.473249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.473268 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.577645 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.577705 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.577722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.577748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.577767 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.681221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.681289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.681312 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.681342 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.681363 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.785492 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.785553 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.785571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.785596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.785620 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.889105 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.889206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.889233 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.889273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.889299 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.992847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.992928 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.992955 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.992993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:40 crc kubenswrapper[4931]: I0130 05:08:40.993019 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:40Z","lastTransitionTime":"2026-01-30T05:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.096313 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.096407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.096465 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.096502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.096563 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.200511 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.200577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.200601 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.200631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.200656 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.305504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.305578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.305597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.305626 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.305646 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.401341 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 07:23:00.576839445 +0000 UTC Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.409259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.409321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.409340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.409370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.409390 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.421526 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.421614 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.421527 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:41 crc kubenswrapper[4931]: E0130 05:08:41.421688 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:41 crc kubenswrapper[4931]: E0130 05:08:41.421854 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:41 crc kubenswrapper[4931]: E0130 05:08:41.422030 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.422721 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:41 crc kubenswrapper[4931]: E0130 05:08:41.422968 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.512564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.512627 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.512646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.512672 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.512692 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.617100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.617180 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.617199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.617234 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.617255 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.720755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.720808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.720817 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.720837 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.720854 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.999276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.999363 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.999383 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.999413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:41 crc kubenswrapper[4931]: I0130 05:08:41.999478 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:41Z","lastTransitionTime":"2026-01-30T05:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.102970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.103032 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.103045 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.103101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.103118 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.206457 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.206544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.206561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.206590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.206614 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.310747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.310825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.310850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.310884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.310906 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.402311 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 09:49:51.382253107 +0000 UTC Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.414508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.414663 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.414694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.414740 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.414767 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.519283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.519372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.519401 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.519466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.519493 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.623466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.623910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.624068 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.624280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.624490 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.728289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.728357 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.728374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.728406 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.728453 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.832876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.833316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.833513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.833680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.833833 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.938030 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.938548 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.938957 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.939149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.939338 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.949914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.949994 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.950021 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.950058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.950088 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:42 crc kubenswrapper[4931]: E0130 05:08:42.973060 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:42Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.979665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.979734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.979759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.979792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:42 crc kubenswrapper[4931]: I0130 05:08:42.979817 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:42Z","lastTransitionTime":"2026-01-30T05:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.002980 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:42Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.008613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.008867 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.009161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.009330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.009523 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.037831 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:43Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.044841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.044901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.044922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.044952 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.044975 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.067954 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:43Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.074144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.074205 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.074223 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.074247 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.074265 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.098462 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:43Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.098627 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.101284 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.101362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.101391 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.101483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.101517 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.205388 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.205495 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.205515 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.205550 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.205571 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.309598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.309665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.309684 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.309711 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.309732 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.402747 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 05:11:21.851445623 +0000 UTC Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.413406 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.413520 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.413548 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.413580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.413611 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.422016 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.422087 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.422207 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.422241 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.422616 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.422864 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.423038 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:43 crc kubenswrapper[4931]: E0130 05:08:43.423151 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.520537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.520650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.520676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.520708 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.520731 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.625106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.625192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.625215 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.625243 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.625264 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.728963 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.729035 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.729063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.729092 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.729113 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.833250 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.833321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.833341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.833369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.833390 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.937771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.937829 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.937845 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.937868 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:43 crc kubenswrapper[4931]: I0130 05:08:43.937883 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:43Z","lastTransitionTime":"2026-01-30T05:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.041732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.041797 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.041816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.041843 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.041864 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.145971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.146037 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.146056 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.146087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.146110 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.249725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.249803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.249823 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.249853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.249879 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.354195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.354276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.354296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.354327 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.354351 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.403832 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:54:31.833241348 +0000 UTC Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.458207 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.458289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.458311 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.458336 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.458354 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.562235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.562316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.562335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.562362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.562382 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.666256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.666306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.666316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.666334 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.666346 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.769914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.769953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.769969 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.769987 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.770002 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.874187 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.874254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.874272 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.874298 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.874317 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.984900 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.984960 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.984971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.984994 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:44 crc kubenswrapper[4931]: I0130 05:08:44.985008 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:44Z","lastTransitionTime":"2026-01-30T05:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.089279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.089348 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.089368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.089398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.089416 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.193356 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.193445 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.193469 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.193497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.193518 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.296937 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.296999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.297018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.297043 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.297063 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.400319 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.400380 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.400398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.400475 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.400496 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.404312 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:41:08.507454626 +0000 UTC Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.421208 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.421238 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:45 crc kubenswrapper[4931]: E0130 05:08:45.421946 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.422030 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.421978 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:45 crc kubenswrapper[4931]: E0130 05:08:45.422187 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:45 crc kubenswrapper[4931]: E0130 05:08:45.422414 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:45 crc kubenswrapper[4931]: E0130 05:08:45.422666 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.446556 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.464240 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc 
kubenswrapper[4931]: I0130 05:08:45.485264 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.504224 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.504365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.504392 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.504481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.504510 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.505066 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.521868 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.543232 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.575100 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.594138 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.608917 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.609059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.609231 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.609267 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.610241 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.614723 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.640708 4931 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.665730 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.690178 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.714377 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.714487 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.714510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.714539 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.714559 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.725504 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 
05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.764134 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.786585 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.809325 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.817005 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.817060 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.817083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.817112 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.817134 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.828649 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.850364 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:45Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.921020 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.921089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.921106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.921135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:45 crc kubenswrapper[4931]: I0130 05:08:45.921154 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:45Z","lastTransitionTime":"2026-01-30T05:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.024876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.025227 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.025378 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.025571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.025708 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.129295 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.129944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.130148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.130357 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.130594 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.234883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.234945 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.234963 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.234992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.235013 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.338866 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.338923 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.338941 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.338966 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.338986 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.405304 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:34:33.20032674 +0000 UTC Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.442405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.442493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.442516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.442544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.442566 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.545629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.545687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.545703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.545731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.545749 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.649134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.649709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.649902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.650064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.650220 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.753881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.753944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.753962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.753989 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.754010 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.857752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.858137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.858286 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.858513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.858676 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.962154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.962211 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.962228 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.962255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:46 crc kubenswrapper[4931]: I0130 05:08:46.962273 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:46Z","lastTransitionTime":"2026-01-30T05:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.071916 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.071982 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.072001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.072028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.072047 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.175713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.175784 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.175803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.175831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.175854 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.278806 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.278860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.278878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.278903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.278921 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.382221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.382287 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.382305 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.382370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.382390 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.406048 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 11:25:34.000414455 +0000 UTC Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.421717 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.421757 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.421765 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:47 crc kubenswrapper[4931]: E0130 05:08:47.421894 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.422162 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:47 crc kubenswrapper[4931]: E0130 05:08:47.423150 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.423559 4931 scope.go:117] "RemoveContainer" containerID="43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26" Jan 30 05:08:47 crc kubenswrapper[4931]: E0130 05:08:47.423256 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:47 crc kubenswrapper[4931]: E0130 05:08:47.423036 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:47 crc kubenswrapper[4931]: E0130 05:08:47.424098 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.485947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.486000 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.486011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.486031 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.486046 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.589904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.589961 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.589977 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.589999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.590013 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.693266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.693819 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.693978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.694161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.694318 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.797355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.797466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.797489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.797518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.797537 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.901720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.901791 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.901810 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.901836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:47 crc kubenswrapper[4931]: I0130 05:08:47.901855 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:47Z","lastTransitionTime":"2026-01-30T05:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.005608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.005703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.005730 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.005763 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.005786 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.108875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.108982 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.109001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.109031 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.109050 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.211534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.211599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.211646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.211669 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.211682 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.315621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.315704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.315721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.315780 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.315804 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.407175 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 11:29:05.73141213 +0000 UTC Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.419184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.419227 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.419238 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.419259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.419270 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.521775 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.521856 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.521875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.521904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.521922 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.626726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.626805 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.626826 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.626854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.626875 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.730315 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.730392 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.730417 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.730486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.730516 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.833978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.834055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.834081 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.834118 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.834146 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.937033 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.937111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.937135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.937169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:48 crc kubenswrapper[4931]: I0130 05:08:48.937190 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:48Z","lastTransitionTime":"2026-01-30T05:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.039970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.040059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.040083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.040116 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.040144 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.143501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.143547 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.143560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.143577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.143589 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.247788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.247825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.247838 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.247852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.247861 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.351140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.351213 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.351243 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.351279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.351304 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.407857 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 04:59:05.613225936 +0000 UTC Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.421507 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.421632 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.421821 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:49 crc kubenswrapper[4931]: E0130 05:08:49.421805 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:49 crc kubenswrapper[4931]: E0130 05:08:49.421962 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:49 crc kubenswrapper[4931]: E0130 05:08:49.422162 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.422531 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:49 crc kubenswrapper[4931]: E0130 05:08:49.423025 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.454478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.454578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.454602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.454632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.454653 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.558487 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.558564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.558582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.558609 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.558628 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.662256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.662298 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.662322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.662340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.662354 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.765157 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.765749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.765962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.766173 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.766325 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.870598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.871186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.871345 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.871532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.871696 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.974319 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.974362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.974374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.974390 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:49 crc kubenswrapper[4931]: I0130 05:08:49.974400 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:49Z","lastTransitionTime":"2026-01-30T05:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.077502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.077538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.077547 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.077563 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.077572 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.180977 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.181025 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.181040 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.181056 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.181069 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.284006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.284622 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.284860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.285120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.285317 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.389186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.389614 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.389727 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.389818 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.389900 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.408825 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 03:23:59.561893276 +0000 UTC Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.492550 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.492600 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.492612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.492630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.492641 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.596811 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.596877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.596898 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.596924 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.596944 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.699297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.699375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.699399 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.699481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.699505 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.803058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.803101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.803111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.803134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.803145 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.906589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.906661 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.906686 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.906716 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:50 crc kubenswrapper[4931]: I0130 05:08:50.906738 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:50Z","lastTransitionTime":"2026-01-30T05:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.010586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.010774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.010854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.011013 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.011107 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.114594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.114651 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.114678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.114696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.114707 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.217457 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.217525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.217539 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.217582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.217595 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.321100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.321143 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.321154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.321172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.321180 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.409303 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 03:27:31.508773818 +0000 UTC Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.421859 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.421856 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.422002 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:51 crc kubenswrapper[4931]: E0130 05:08:51.422016 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.421989 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:51 crc kubenswrapper[4931]: E0130 05:08:51.422137 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:51 crc kubenswrapper[4931]: E0130 05:08:51.422276 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:51 crc kubenswrapper[4931]: E0130 05:08:51.422446 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.423869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.423903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.423915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.423935 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.423955 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.526571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.526621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.526631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.526649 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.526662 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.629017 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.629076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.629087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.629110 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.629124 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.733404 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.733492 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.733504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.733524 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.733536 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.836820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.836880 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.836892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.836914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.836929 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.939901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.939970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.939984 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.940011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:51 crc kubenswrapper[4931]: I0130 05:08:51.940029 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:51Z","lastTransitionTime":"2026-01-30T05:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.042342 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.042462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.042489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.042523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.042543 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.145441 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.145522 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.145538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.145558 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.145569 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.248611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.248684 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.248703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.248734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.248753 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.352790 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.352875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.352894 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.352925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.352946 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.409689 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:27:23.730982746 +0000 UTC Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.455739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.455828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.455848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.455879 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.455903 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.559170 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.559249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.559267 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.559299 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.559327 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.663412 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.663493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.663510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.663534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.663550 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.766854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.766912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.766922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.766941 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.766952 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.870519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.870579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.870592 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.870615 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.870628 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.974148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.974256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.974274 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.974300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:52 crc kubenswrapper[4931]: I0130 05:08:52.974322 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:52Z","lastTransitionTime":"2026-01-30T05:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.077361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.077398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.077407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.077443 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.077453 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.180912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.180996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.181015 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.181078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.181094 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.283751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.283817 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.283834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.283860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.283881 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.343546 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.343637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.343659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.343688 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.343707 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.365813 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:53Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.370922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.370972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.370983 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.371005 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.371018 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.384768 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:53Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.388655 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.388718 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.388739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.388765 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.388785 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.407109 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:53Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.410057 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 21:03:57.288589207 +0000 UTC Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.410728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.410778 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.410797 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.410820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.410836 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.421302 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.421330 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.421334 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.421397 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.421531 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.421698 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.421823 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.421963 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.427181 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:53Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.431841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.431899 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.431916 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.431942 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.431961 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.444877 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:53Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.445006 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.452628 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.452980 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.453089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.453109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.453126 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.555386 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.555458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.555473 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.555493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.555549 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.659083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.659115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.659125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.659138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.659148 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.761620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.761662 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.761670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.761689 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.761699 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.865164 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.865228 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.865247 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.865273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.865294 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.965626 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.965833 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:53 crc kubenswrapper[4931]: E0130 05:08:53.965895 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:09:25.965877133 +0000 UTC m=+101.335787400 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.968866 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.968923 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.968932 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.968949 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:53 crc kubenswrapper[4931]: I0130 05:08:53.968959 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:53Z","lastTransitionTime":"2026-01-30T05:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.071365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.071432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.071446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.071461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.071472 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.174333 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.174382 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.174391 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.174405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.174440 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.278018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.278076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.278085 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.278101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.278112 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.380872 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.380946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.380961 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.380979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.380989 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.410654 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 02:31:54.034087616 +0000 UTC Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.483759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.483819 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.483828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.483845 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.483856 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.587315 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.587380 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.587397 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.587451 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.587470 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.690370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.690411 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.690448 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.690469 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.690481 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.793655 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.793722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.793744 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.793774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.793795 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.895933 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.896000 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.896025 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.896055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.896076 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.997950 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.998005 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.998014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.998030 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:54 crc kubenswrapper[4931]: I0130 05:08:54.998042 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:54Z","lastTransitionTime":"2026-01-30T05:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.100481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.100518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.100559 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.100579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.100589 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.203608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.203672 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.203690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.203717 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.203736 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.306797 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.306865 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.306888 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.306915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.306935 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.409732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.409772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.409786 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.409803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.409816 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.411059 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 02:04:50.291938 +0000 UTC Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.421416 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.421503 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:55 crc kubenswrapper[4931]: E0130 05:08:55.421548 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.421577 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.421576 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:55 crc kubenswrapper[4931]: E0130 05:08:55.421757 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:55 crc kubenswrapper[4931]: E0130 05:08:55.421947 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:55 crc kubenswrapper[4931]: E0130 05:08:55.422487 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.445488 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.460802 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.477823 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.498360 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.512857 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.512925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.512939 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.513361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.513399 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.522152 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.536231 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.552367 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.588737 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 
05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.610347 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.616748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.616798 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.616815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.616841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.616858 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.630250 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.645513 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.663043 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.676684 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.692826 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.707330 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc 
kubenswrapper[4931]: I0130 05:08:55.720386 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.720606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.720688 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.720771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.720817 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.724367 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.740068 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.757031 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:55Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.823384 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.823470 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.823488 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.823510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.823528 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.925814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.925871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.925882 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.925903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:55 crc kubenswrapper[4931]: I0130 05:08:55.925916 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:55Z","lastTransitionTime":"2026-01-30T05:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.030175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.030247 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.030269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.030296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.030317 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.054298 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/0.log" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.054383 4931 generic.go:334] "Generic (PLEG): container finished" podID="b17d6adf-e35b-4bf8-9ab2-e6720e595835" containerID="71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899" exitCode=1 Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.054455 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerDied","Data":"71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.055063 4931 scope.go:117] "RemoveContainer" containerID="71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.073077 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.088005 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.101066 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc 
kubenswrapper[4931]: I0130 05:08:56.116889 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.128821 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.132352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.132385 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.132395 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.132411 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.132447 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.144486 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.159395 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 
05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.172700 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c26632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.187550 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.208963 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.224396 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" 
Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.236563 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.236588 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.236597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.236613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.236625 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.237157 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.256546 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.284243 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.311902 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.329525 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.339183 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.339275 4931 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.339295 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.339322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.339367 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.344516 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.361507 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:56Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.411118 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 19:31:36.976584002 +0000 UTC Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.442637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.442713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.442726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.442743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.442755 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.544953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.544990 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.545001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.545016 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.545027 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.647561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.647604 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.647615 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.647633 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.647647 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.751743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.751796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.751813 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.751838 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.751855 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.855302 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.855344 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.855372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.855394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.855410 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.958438 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.958500 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.958516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.958539 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:56 crc kubenswrapper[4931]: I0130 05:08:56.958557 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:56Z","lastTransitionTime":"2026-01-30T05:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.060493 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/0.log" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.060557 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerStarted","Data":"c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.062781 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.062848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.062868 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.062899 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.062922 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.086504 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.102881 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.118109 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.133831 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.156343 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 
05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.165405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.165477 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.165489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.165506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.165533 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.195606 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.251013 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.269199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.269244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.269255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.269271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.269281 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.279899 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.306896 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.318929 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.336074 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.350133 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc 
kubenswrapper[4931]: I0130 05:08:57.362481 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c26632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.371803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.371869 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.371891 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.371920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.371935 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.377215 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.390146 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.403228 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.412036 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 23:53:39.472330303 +0000 UTC Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.416053 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5c
e248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":
\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.421047 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.421052 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.421316 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:57 crc kubenswrapper[4931]: E0130 05:08:57.421195 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:57 crc kubenswrapper[4931]: E0130 05:08:57.421589 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.421630 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:57 crc kubenswrapper[4931]: E0130 05:08:57.421884 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:57 crc kubenswrapper[4931]: E0130 05:08:57.421691 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.433871 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:57Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.475537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.475985 4931 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.476242 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.476385 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.476540 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.579483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.579534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.579546 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.579570 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.579586 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.682676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.682747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.682766 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.682796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.682815 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.786594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.786660 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.786680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.786705 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.786723 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.889796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.889858 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.889877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.889904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.889924 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.992455 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.992502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.992519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.992542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:57 crc kubenswrapper[4931]: I0130 05:08:57.992569 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:57Z","lastTransitionTime":"2026-01-30T05:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.094747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.094804 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.094824 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.094848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.094867 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.198486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.198532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.198544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.198563 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.198575 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.301666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.301721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.301738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.301762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.301779 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.405144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.405221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.405243 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.405273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.405296 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.413248 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 22:12:52.093603075 +0000 UTC Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.508128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.508174 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.508193 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.508219 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.508236 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.613892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.614155 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.614228 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.614301 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.614370 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.718103 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.718260 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.718286 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.718318 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.718339 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.821544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.821631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.821658 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.821687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.821708 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.924972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.925050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.925196 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.925230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:58 crc kubenswrapper[4931]: I0130 05:08:58.925252 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:58Z","lastTransitionTime":"2026-01-30T05:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.028313 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.028355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.028365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.028385 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.028396 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.131055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.131107 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.131120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.131140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.131150 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.234205 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.234260 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.234271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.234292 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.234306 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.337753 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.337822 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.337837 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.337861 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.337873 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.414301 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 06:12:54.271271346 +0000 UTC Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.421812 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.421880 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:59 crc kubenswrapper[4931]: E0130 05:08:59.422035 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.422100 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.422108 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:59 crc kubenswrapper[4931]: E0130 05:08:59.422280 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:59 crc kubenswrapper[4931]: E0130 05:08:59.422527 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:08:59 crc kubenswrapper[4931]: E0130 05:08:59.424520 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.425190 4931 scope.go:117] "RemoveContainer" containerID="43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.440001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.440044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.440059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.440077 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.440096 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.543611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.543695 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.543721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.543755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.543780 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.647280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.647350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.647376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.647408 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.647509 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.751013 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.751052 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.751066 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.751084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.751097 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.854735 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.854790 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.854802 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.854825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.854839 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.957514 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.957564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.957574 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.957597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:59 crc kubenswrapper[4931]: I0130 05:08:59.957609 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:59Z","lastTransitionTime":"2026-01-30T05:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.060407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.060476 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.060486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.060548 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.060562 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.074460 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/2.log" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.077802 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.079000 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.096098 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.109440 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.124017 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.135550 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.149650 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.163165 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.163220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.163232 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.163250 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.163264 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.165314 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.180215 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.194036 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.212151 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.229855 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.245504 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.264553 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.265743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.265788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.265804 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.265823 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.265840 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.281206 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.303989 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.320781 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.343344 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 
05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.364668 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.368639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.368704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.368722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.368747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.368765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.380690 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:00Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.414885 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:58:29.37915117 +0000 UTC Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.471710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.471738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.471749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.471762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.471772 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.574156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.574217 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.574236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.574259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.574278 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.676944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.677011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.677024 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.677046 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.677073 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.780849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.780926 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.780946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.780970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.780987 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.884359 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.884415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.884458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.884482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.884499 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.986749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.986777 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.986788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.986802 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:00 crc kubenswrapper[4931]: I0130 05:09:00.986813 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:00Z","lastTransitionTime":"2026-01-30T05:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.083817 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/3.log" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.084765 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/2.log" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088794 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" exitCode=1 Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088840 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088844 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088886 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088888 4931 scope.go:117] "RemoveContainer" containerID="43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.088897 
4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.090070 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:09:01 crc kubenswrapper[4931]: E0130 05:09:01.090354 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.121827 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.151620 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.170117 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.190104 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.192518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.192569 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.192580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.192599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.192611 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.209785 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.225593 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e
5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.253623 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.273293 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.293063 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.295031 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.295105 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.295132 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.295164 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.295185 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.312268 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.333847 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.362042 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43394fdb8a93566a07f6555221a2aa2a7990c65a6623e03d361d12fdafb9be26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:33Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI0130 05:08:33.522075 6561 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 05:08:33.522092 6561 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 05:08:33.522157 6561 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 
05:08:33.522163 6561 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 05:08:33.522187 6561 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 05:08:33.522204 6561 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 05:08:33.522222 6561 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 05:08:33.522225 6561 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 05:08:33.522242 6561 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 05:08:33.522257 6561 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 05:08:33.522264 6561 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 05:08:33.522272 6561 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 05:08:33.522290 6561 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 05:08:33.522388 6561 factory.go:656] Stopping watch factory\\\\nI0130 05:08:33.522398 6561 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:09:00Z\\\",\\\"message\\\":\\\"ints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0130 05:09:00.314584 6959 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 05:09:00.314585 6959 services_controller.go:444] Built service openshift-dns-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314592 6959 services_controller.go:445] Built service 
openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314395 6959 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0130 05:09:00.314653 6959 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.379689 4931 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc 
kubenswrapper[4931]: I0130 05:09:01.398970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.399038 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.399049 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.399072 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.399088 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.399951 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.415455 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 20:37:59.067283235 +0000 UTC Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.416686 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.420925 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.420976 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:01 crc kubenswrapper[4931]: E0130 05:09:01.421064 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.421168 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.421204 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:01 crc kubenswrapper[4931]: E0130 05:09:01.421265 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:01 crc kubenswrapper[4931]: E0130 05:09:01.421627 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:01 crc kubenswrapper[4931]: E0130 05:09:01.421785 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.429791 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.447379 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.462804 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:01Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.502576 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.502647 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.502673 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.502717 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.502739 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.606795 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.606868 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.606881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.606900 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.606936 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.711658 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.711744 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.711762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.711793 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.711813 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.815397 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.815508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.815528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.815558 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.815584 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.919220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.919275 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.919297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.919321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:01 crc kubenswrapper[4931]: I0130 05:09:01.919339 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:01Z","lastTransitionTime":"2026-01-30T05:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.023246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.023328 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.023352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.023383 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.023404 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.097273 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/3.log" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.103352 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:09:02 crc kubenswrapper[4931]: E0130 05:09:02.103606 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.118565 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806c
b99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.127279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.127341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.127365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc 
kubenswrapper[4931]: I0130 05:09:02.127387 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.127404 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.137528 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.152957 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc 
kubenswrapper[4931]: I0130 05:09:02.173572 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.194597 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.207547 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.224237 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.230320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.230415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.230464 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.230495 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.230516 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.239967 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c26632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.255837 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.270633 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.286963 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.306390 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.330122 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.335543 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.335612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.335631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.335666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.335688 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.365990 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:09:00Z\\\",\\\"message\\\":\\\"ints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0130 05:09:00.314584 6959 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 05:09:00.314585 6959 
services_controller.go:444] Built service openshift-dns-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314592 6959 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314395 6959 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0130 05:09:00.314653 6959 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.402044 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.415856 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 12:59:37.441258095 +0000 UTC Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.429624 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5
b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.439812 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.439891 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.439912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.439947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.439966 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.450553 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.474365 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:02Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.543149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.543218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc 
kubenswrapper[4931]: I0130 05:09:02.543237 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.543266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.543287 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.647030 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.647104 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.647124 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.647149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.647168 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.751642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.751705 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.751723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.751751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.751785 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.855046 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.855136 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.855160 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.855192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.855216 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.959339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.959416 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.959471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.959501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:02 crc kubenswrapper[4931]: I0130 05:09:02.959526 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:02Z","lastTransitionTime":"2026-01-30T05:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.063271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.063361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.063465 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.063491 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.063506 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.167236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.167320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.167338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.167366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.167388 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.271563 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.271715 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.271736 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.271766 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.271788 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.376079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.376153 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.376171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.376200 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.376219 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.416317 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 04:59:52.372201449 +0000 UTC Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.421894 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.421991 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.422006 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.422236 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.422667 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.422854 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.423244 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.423541 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.480277 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.480344 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.480365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.480394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.480418 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.557757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.557976 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.558000 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.558029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.558049 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.583207 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:03Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.589774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.589850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.589870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.589899 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.589918 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.613182 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:03Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.619117 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.619214 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.619236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.619262 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.619281 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.641324 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:03Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.651300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.651612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.651759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.651910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.652104 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.674358 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:03Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.680747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.680995 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.681142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.681289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.681488 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.704339 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:03Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:03 crc kubenswrapper[4931]: E0130 05:09:03.705623 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.711216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.711279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.711297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.711347 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.711368 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.814381 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.814490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.814508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.814532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.814550 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.917327 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.917390 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.917409 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.917461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:03 crc kubenswrapper[4931]: I0130 05:09:03.917480 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:03Z","lastTransitionTime":"2026-01-30T05:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.020668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.020886 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.021039 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.021190 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.021340 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.124025 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.124114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.124145 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.124182 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.124208 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.227706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.227756 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.227772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.227793 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.227812 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.331244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.331314 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.331337 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.331366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.331387 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.417091 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 11:41:30.839384919 +0000 UTC Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.436353 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.436408 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.436452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.436478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.436496 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.540838 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.540904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.540915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.540939 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.540953 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.645268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.645368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.645387 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.645415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.645651 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.750106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.750216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.750245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.750277 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.750301 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.854666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.854732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.854749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.854773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.854792 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.958617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.958678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.958696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.958721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:04 crc kubenswrapper[4931]: I0130 05:09:04.958739 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:04Z","lastTransitionTime":"2026-01-30T05:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.063962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.064051 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.064075 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.064119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.064146 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.169764 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.169855 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.169891 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.169924 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.169942 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.274178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.274306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.274397 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.274597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.274670 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.378660 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.378762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.378783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.378811 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.378831 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.417852 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 09:54:15.215758887 +0000 UTC Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.421385 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.421671 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:05 crc kubenswrapper[4931]: E0130 05:09:05.421839 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:05 crc kubenswrapper[4931]: E0130 05:09:05.421621 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.421951 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:05 crc kubenswrapper[4931]: E0130 05:09:05.422040 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.422217 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:05 crc kubenswrapper[4931]: E0130 05:09:05.422311 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.448413 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.471881 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.482320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.482373 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.482393 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.482456 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.482498 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.490940 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.510739 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.529002 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.547825 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc 
kubenswrapper[4931]: I0130 05:09:05.572101 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.585508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.585580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.585601 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.585634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 
05:09:05.585656 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.590822 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c26632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.613400 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.634260 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.659674 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.684922 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.688956 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.689603 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.689901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.690142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.690342 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.724727 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.752233 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5
b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.775135 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.794375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.794816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.794967 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.795090 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.795194 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.798120 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.822361 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.862540 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:09:00Z\\\",\\\"message\\\":\\\"ints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0130 05:09:00.314584 6959 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 05:09:00.314585 6959 
services_controller.go:444] Built service openshift-dns-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314592 6959 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314395 6959 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0130 05:09:00.314653 6959 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:05Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.900111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.900268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.900290 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.900321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:05 crc kubenswrapper[4931]: I0130 05:09:05.900342 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:05Z","lastTransitionTime":"2026-01-30T05:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.003575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.004101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.004354 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.004641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.004866 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.108316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.108387 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.108408 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.108467 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.108493 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.212729 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.212816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.212835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.212866 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.212886 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.316714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.317142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.317355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.317560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.317748 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.418848 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 14:12:20.556816041 +0000 UTC Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.422868 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.422954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.422979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.423015 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.423036 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.527818 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.527970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.527995 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.528024 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.528042 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.631501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.631590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.631613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.631644 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.631663 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.735513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.735587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.735606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.735637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.735657 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.839690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.839761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.839780 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.839814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.839833 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.943181 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.943269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.943296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.943334 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:06 crc kubenswrapper[4931]: I0130 05:09:06.943392 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:06Z","lastTransitionTime":"2026-01-30T05:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.046746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.046829 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.046858 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.046898 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.046925 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.150785 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.150842 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.150857 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.150877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.150891 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.254260 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.254325 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.254346 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.254375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.254398 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.357981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.358045 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.358067 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.358092 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.358110 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.420067 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 07:06:45.289250952 +0000 UTC Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.421473 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.421528 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.421549 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:07 crc kubenswrapper[4931]: E0130 05:09:07.421712 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.421777 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:07 crc kubenswrapper[4931]: E0130 05:09:07.422087 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:07 crc kubenswrapper[4931]: E0130 05:09:07.422538 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:07 crc kubenswrapper[4931]: E0130 05:09:07.431995 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.445189 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.462291 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.462376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.462404 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.462464 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.462486 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.566031 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.566098 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.566120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.566148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.566166 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.670250 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.670312 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.670330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.670355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.670375 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.774230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.774302 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.774346 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.774376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.774399 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.878290 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.878345 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.878364 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.878396 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.878418 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.981320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.981383 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.981401 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.981450 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:07 crc kubenswrapper[4931]: I0130 05:09:07.981469 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:07Z","lastTransitionTime":"2026-01-30T05:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.084946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.085028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.085051 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.085082 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.085105 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.189046 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.189133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.189157 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.189197 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.189222 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.292679 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.292761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.292780 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.292816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.292840 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.397394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.397502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.397532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.397568 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.397596 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.420563 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 14:59:50.12951867 +0000 UTC Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.500518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.500611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.500639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.500675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.500703 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.604414 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.604539 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.604561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.604591 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.604614 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.708172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.708251 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.708269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.708330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.708354 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.811642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.811694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.811711 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.811739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.811756 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.914199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.914264 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.914280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.914303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:08 crc kubenswrapper[4931]: I0130 05:09:08.914322 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:08Z","lastTransitionTime":"2026-01-30T05:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.018176 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.018257 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.018275 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.018304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.018326 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.058153 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.058803 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.058748406 +0000 UTC m=+148.428658713 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.122226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.122300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.122324 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.122357 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.122378 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.159572 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.159664 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.159720 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.159768 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.159912 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.159959 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160114 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160136 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.159976 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.159980 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160257 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160273 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160087 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.160060244 +0000 UTC m=+148.529970541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160347 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.160319902 +0000 UTC m=+148.530230189 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160370 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.160357453 +0000 UTC m=+148.530267740 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.160392 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.160379934 +0000 UTC m=+148.530290221 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.226798 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.226880 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.226899 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.226931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.226953 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.329832 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.329918 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.329947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.329984 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.330010 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.421284 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 14:54:37.472449516 +0000 UTC
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.421618 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.421695 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.421695 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.421821 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.421955 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.421970 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62"
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.422038 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:09:09 crc kubenswrapper[4931]: E0130 05:09:09.422154 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.432560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.432601 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.432616 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.432634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.432647 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.535791 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.535870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.535890 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.535920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.535939 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.639798 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.639873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.639884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.639905 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.639918 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.744059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.744125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.744135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.744157 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.744169 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.847339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.847394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.847404 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.847585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.847618 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.951702 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.951789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.951816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.951851 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:09 crc kubenswrapper[4931]: I0130 05:09:09.951881 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:09Z","lastTransitionTime":"2026-01-30T05:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.055806 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.055881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.055902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.055934 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.055956 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.159480 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.159559 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.159579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.159611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.159635 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.263485 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.263554 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.263577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.263606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.263627 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.367687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.367808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.367837 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.367953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.367990 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.421595 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 22:32:04.19626699 +0000 UTC
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.474845 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.474912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.474933 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.474966 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.474987 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.577349 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.577413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.577455 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.577483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.577498 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.680889 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.680952 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.680971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.680998 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.681016 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.784104 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.784183 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.784201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.784227 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.784246 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.887014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.887093 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.887113 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.887141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.887162 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.990869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.990953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.990985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.991015 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:10 crc kubenswrapper[4931]: I0130 05:09:10.991037 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:10Z","lastTransitionTime":"2026-01-30T05:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.094846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.095348 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.095378 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.095412 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.095487 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.199047 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.199115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.199140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.199174 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.199198 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.301902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.302132 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.302156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.302190 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.302212 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.407072 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.407133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.407151 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.407184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.407203 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.421217 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.421226 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.421408 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.421493 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:11 crc kubenswrapper[4931]: E0130 05:09:11.421625 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.421701 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 23:32:26.575580243 +0000 UTC Jan 30 05:09:11 crc kubenswrapper[4931]: E0130 05:09:11.421840 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:11 crc kubenswrapper[4931]: E0130 05:09:11.422029 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:11 crc kubenswrapper[4931]: E0130 05:09:11.422165 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.510358 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.510458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.510481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.510510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.510529 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.613114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.613195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.613214 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.613244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.613265 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.716985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.717050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.717069 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.717096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.717116 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.820123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.820166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.820184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.820205 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.820239 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.924396 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.924493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.924510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.924534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:11 crc kubenswrapper[4931]: I0130 05:09:11.924551 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:11Z","lastTransitionTime":"2026-01-30T05:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.027996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.028042 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.028061 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.028084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.028102 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.131695 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.131747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.131765 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.131790 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.131806 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.234828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.234879 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.234897 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.234920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.234938 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.337451 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.337506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.337523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.337546 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.337562 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.422015 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 22:38:54.591546017 +0000 UTC Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.440756 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.440816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.440836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.440861 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.440881 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.544093 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.544144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.544161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.544186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.544204 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.647940 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.648040 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.648065 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.648097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.648119 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.751390 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.751481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.751501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.751525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.751544 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.854692 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.854758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.854776 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.854802 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.854824 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.958733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.958803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.958830 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.958860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:12 crc kubenswrapper[4931]: I0130 05:09:12.958885 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:12Z","lastTransitionTime":"2026-01-30T05:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.061951 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.062006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.062022 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.062043 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.062060 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.164952 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.164997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.165014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.165036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.165057 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.268695 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.268757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.268777 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.268807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.268825 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.372460 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.372518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.372536 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.372561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.372579 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.421521 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.421557 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.421521 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.421727 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.421793 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.421959 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.422064 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.422136 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 04:22:02.572922745 +0000 UTC Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.422259 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.475645 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.475707 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.475725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.475750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.475768 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.579166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.579235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.579252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.579273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.579290 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.683027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.683090 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.683109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.683131 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.683149 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.737353 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.737415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.737461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.737486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.737503 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.756122 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.761252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.761293 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.761302 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.761316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.761327 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.783758 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.788908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.788973 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.788989 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.789027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.789052 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.815596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.815648 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.815672 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.815698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.815717 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.833395 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.833580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.833602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.833628 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.833650 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.853757 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:13Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:13 crc kubenswrapper[4931]: E0130 05:09:13.854019 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.856854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.856920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.856947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.856979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.857003 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.961570 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.961661 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.961689 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.961770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:13 crc kubenswrapper[4931]: I0130 05:09:13.961802 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:13Z","lastTransitionTime":"2026-01-30T05:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.065582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.065661 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.065686 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.065714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.065734 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.169731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.169795 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.169809 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.169836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.169852 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.272909 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.273468 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.273651 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.273792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.273945 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.377867 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.378123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.378265 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.378413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.378616 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.422856 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 23:00:18.680465006 +0000 UTC Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.481639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.481718 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.481744 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.481771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.481791 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.585492 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.585554 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.585571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.585597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.585617 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.688721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.688796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.688818 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.688847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.688868 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.793030 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.793140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.793170 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.793208 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.793233 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.896887 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.897340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.897462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.897586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:14 crc kubenswrapper[4931]: I0130 05:09:14.897673 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:14Z","lastTransitionTime":"2026-01-30T05:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.001669 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.001755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.001784 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.001823 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.001850 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.105931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.106000 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.106019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.106047 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.106070 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.210192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.210288 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.210311 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.210341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.210362 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.313803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.313861 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.313878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.313901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.313918 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.417800 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.417870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.417884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.417908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.417923 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.421138 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.421179 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.421308 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:15 crc kubenswrapper[4931]: E0130 05:09:15.421463 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.421599 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:15 crc kubenswrapper[4931]: E0130 05:09:15.421760 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:15 crc kubenswrapper[4931]: E0130 05:09:15.422055 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:15 crc kubenswrapper[4931]: E0130 05:09:15.422249 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.423131 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 16:02:54.45188772 +0000 UTC Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.448218 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.467071 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.488818 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.504803 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.521314 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.521390 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.521409 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.521467 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.521489 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.525316 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.583514 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e
5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.617412 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.624566 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.624629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.624649 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.624677 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.624697 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.643840 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.662622 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.681488 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.712138 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.727777 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.727835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.727849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.727872 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.727885 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.747912 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:09:00Z\\\",\\\"message\\\":\\\"ints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0130 05:09:00.314584 6959 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 05:09:00.314585 6959 
services_controller.go:444] Built service openshift-dns-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314592 6959 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314395 6959 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0130 05:09:00.314653 6959 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.764557 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.780498 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cba59fc-ee1f-450a-ab9e-2743c1bbb933\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb77e9defc8c4121eae34daeca1948ee8aef2d6c884fb05b2a5c53e85cbe9c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e913d425338029033aa1073be9668185b9f9c4dbb2560466d086b52aa6ce17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e913d425338029033aa1073be9668185b9f9c4dbb2560466d086b52aa6ce17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.798787 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.818293 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.831888 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.831946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.831965 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.831994 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.832015 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.834629 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.854154 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec15384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.872939 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:15Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.935652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.935728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.935746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.935773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:15 crc kubenswrapper[4931]: I0130 05:09:15.935795 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:15Z","lastTransitionTime":"2026-01-30T05:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.038749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.038848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.038871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.038907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.038931 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.142848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.142900 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.142920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.142946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.142967 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.247375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.247496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.247620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.247771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.247805 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.351280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.351358 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.351377 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.351405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.351455 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.422796 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:09:16 crc kubenswrapper[4931]: E0130 05:09:16.423084 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.423399 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 23:45:04.421852804 +0000 UTC Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.455118 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.455384 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.455621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.455770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.455902 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.559488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.559767 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.559895 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.560070 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.560229 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.664085 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.664196 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.664218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.664811 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.665031 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.769647 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.769762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.769786 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.769822 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.769844 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.873352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.873496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.873526 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.873565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.873594 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.977218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.977316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.977353 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.977387 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:16 crc kubenswrapper[4931]: I0130 05:09:16.977409 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:16Z","lastTransitionTime":"2026-01-30T05:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.081341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.081415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.081483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.081510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.081530 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.184638 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.184710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.184728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.184754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.184801 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.288369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.288505 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.288544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.288578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.288610 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.391857 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.391935 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.391960 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.391991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.392016 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.421948 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.421987 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:17 crc kubenswrapper[4931]: E0130 05:09:17.422514 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.422296 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.422063 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:17 crc kubenswrapper[4931]: E0130 05:09:17.422979 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:17 crc kubenswrapper[4931]: E0130 05:09:17.423096 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:17 crc kubenswrapper[4931]: E0130 05:09:17.423338 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.423653 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:41:34.444683837 +0000 UTC Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.495483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.495951 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.496213 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.496405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.496665 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.600813 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.600874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.600893 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.600921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.600940 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.705517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.705586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.705604 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.705630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.705649 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.809399 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.809521 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.809544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.809576 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.809748 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.914388 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.914551 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.914587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.914741 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:17 crc kubenswrapper[4931]: I0130 05:09:17.914777 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:17Z","lastTransitionTime":"2026-01-30T05:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.018519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.018631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.018652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.018678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.018744 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.123511 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.123588 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.123609 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.123641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.123669 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.226349 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.226413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.226470 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.226503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.226524 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.329707 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.329781 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.329807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.329839 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.329858 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.425514 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 19:53:11.113606302 +0000 UTC Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.434804 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.434874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.434892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.434917 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.435349 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.539185 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.539254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.539273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.539301 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.539320 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.643570 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.643629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.643647 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.643670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.643688 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.747674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.747742 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.747755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.747778 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.747792 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.852543 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.852708 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.852729 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.852752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.852772 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.955814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.955901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.955920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.955943 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:18 crc kubenswrapper[4931]: I0130 05:09:18.955963 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:18Z","lastTransitionTime":"2026-01-30T05:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.060617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.060698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.061000 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.061300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.061359 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.165782 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.165867 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.165884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.165907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.165921 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.270127 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.270193 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.270216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.270246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.270269 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.374271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.374362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.374382 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.374855 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.374921 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.422194 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.422237 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:19 crc kubenswrapper[4931]: E0130 05:09:19.422494 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.422580 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.422237 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:19 crc kubenswrapper[4931]: E0130 05:09:19.422692 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:19 crc kubenswrapper[4931]: E0130 05:09:19.422813 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:19 crc kubenswrapper[4931]: E0130 05:09:19.422900 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.427142 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 01:21:24.386510162 +0000 UTC Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.479037 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.479117 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.479138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.479177 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.479201 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.581612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.581670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.581698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.581729 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.581753 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.684316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.684376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.684400 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.684471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.684497 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.788287 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.788372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.788397 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.788488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.788517 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.892101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.892193 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.892214 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.892238 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.892257 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.995549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.995663 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.995685 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.995715 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:19 crc kubenswrapper[4931]: I0130 05:09:19.995734 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:19Z","lastTransitionTime":"2026-01-30T05:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.099822 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.099904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.099925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.099954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.099977 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.203140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.203203 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.203220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.203245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.203265 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.306563 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.306699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.306719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.306748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.306768 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.409707 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.409774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.409799 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.409832 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.409859 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.427676 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 13:02:08.75939317 +0000 UTC Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.512570 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.512680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.512699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.512754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.512770 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.615741 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.615831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.615861 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.615894 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.615916 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.719268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.719378 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.719411 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.719472 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.719499 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.823270 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.823390 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.823462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.823524 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.823546 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.930377 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.930534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.930700 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.930892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:20 crc kubenswrapper[4931]: I0130 05:09:20.931280 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:20Z","lastTransitionTime":"2026-01-30T05:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.034964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.035028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.035048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.035076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.035095 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.138991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.139057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.139078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.139111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.139132 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.242085 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.242159 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.242178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.242204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.242223 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.345891 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.345958 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.345976 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.346005 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.346026 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.421887 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.421955 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.422009 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.422020 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:21 crc kubenswrapper[4931]: E0130 05:09:21.422142 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:21 crc kubenswrapper[4931]: E0130 05:09:21.422354 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:21 crc kubenswrapper[4931]: E0130 05:09:21.422788 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:21 crc kubenswrapper[4931]: E0130 05:09:21.422934 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.428286 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 22:04:18.362374625 +0000 UTC Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.449453 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.449506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.449527 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.449550 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.449570 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.553725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.553809 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.553829 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.553859 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.553881 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.657589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.657681 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.657701 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.657734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.657752 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.760750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.760808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.760827 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.760857 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.760887 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.863807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.863874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.863894 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.863922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.863941 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.967666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.967741 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.967750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.967770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:21 crc kubenswrapper[4931]: I0130 05:09:21.967780 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:21Z","lastTransitionTime":"2026-01-30T05:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.071028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.071095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.071114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.071195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.071219 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.174545 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.174623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.174641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.174670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.174694 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.278175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.278246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.278268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.278296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.278317 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.382029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.382114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.382128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.382148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.382161 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.429083 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 03:19:05.046308926 +0000 UTC Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.485602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.485691 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.486099 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.486394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.486452 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.590134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.590203 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.590222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.590248 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.590269 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.694086 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.694155 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.694174 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.694204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.694224 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.798235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.798302 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.798321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.798353 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.798372 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.901599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.901730 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.901748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.901813 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:22 crc kubenswrapper[4931]: I0130 05:09:22.901831 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:22Z","lastTransitionTime":"2026-01-30T05:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.005106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.005175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.005197 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.005226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.005246 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.151597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.151667 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.151707 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.151774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.151834 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.257315 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.257406 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.257434 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.257456 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.257470 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.361981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.362058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.362078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.362109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.362174 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.422025 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.422261 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.422463 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.422575 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.422626 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.422814 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.422954 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.423093 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.429236 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 22:59:21.062135588 +0000 UTC Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.465119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.465195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.465221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.465250 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.465274 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.568321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.568492 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.568523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.568555 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.568638 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.671584 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.671694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.671714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.671743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.671765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.775052 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.775115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.775134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.775160 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.775182 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.878080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.878164 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.878185 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.878209 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.878227 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.879999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.880062 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.880082 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.880110 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.880128 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.901688 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:23Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.908030 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.908106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.908119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.908144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.908161 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.933637 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:23Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.941227 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.941326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.941375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.941405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.941465 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.963680 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:23Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.969841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.969910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.969930 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.969959 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.969980 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:23 crc kubenswrapper[4931]: E0130 05:09:23.991871 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:23Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.997179 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.997234 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.997255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.997278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:23 crc kubenswrapper[4931]: I0130 05:09:23.997295 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:23Z","lastTransitionTime":"2026-01-30T05:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:24 crc kubenswrapper[4931]: E0130 05:09:24.017583 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:09:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9d83649b-6a34-4b83-bc96-3ff1ac14c758\\\",\\\"systemUUID\\\":\\\"babf1111-baa6-43bf-8e98-8707b9d18072\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:24Z is after 2025-08-24T17:21:41Z"
Jan 30 05:09:24 crc kubenswrapper[4931]: E0130 05:09:24.017817 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.019850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.019902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.019922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.019943 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.019979 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.123849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.123932 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.123953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.123978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.123997 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.227237 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.227296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.227316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.227341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.227360 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.331049 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.331271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.331301 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.331336 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.331358 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.430453 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 10:12:04.03570343 +0000 UTC
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.435489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.435560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.435581 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.435610 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.435628 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.538712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.538771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.538788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.538809 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.538827 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.641578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.641625 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.641639 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.641657 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.641669 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.746356 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.746551 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.746583 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.746618 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.746643 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.850338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.850454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.850482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.850515 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.850538 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.953958 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.954065 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.954083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.954111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:24 crc kubenswrapper[4931]: I0130 05:09:24.954134 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:24Z","lastTransitionTime":"2026-01-30T05:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.056914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.056980 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.057006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.057051 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.057085 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.161011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.161076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.161095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.161119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.161137 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.263954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.264016 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.264035 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.264058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.264076 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.367001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.367088 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.367109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.367138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.367161 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.421164 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.421224 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.421301 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:09:25 crc kubenswrapper[4931]: E0130 05:09:25.421333 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.421370 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:09:25 crc kubenswrapper[4931]: E0130 05:09:25.421594 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62"
Jan 30 05:09:25 crc kubenswrapper[4931]: E0130 05:09:25.421727 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:09:25 crc kubenswrapper[4931]: E0130 05:09:25.421892 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.430742 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:23:05.381151445 +0000 UTC
Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.444710 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b05b3-9d05-4908-8700-10648cb914ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cc9d5056c2d95a99de13c3f3f4a492966c6933e1bcc23fcc1e705ba2b9271a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09deb2eebd0cef929f31cb9252562c74dbe7ddce16b0db7c43e094b0f23c9aae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb56a6c73b5d4a29e34aaa83e0be6892cf5f5ba3284b18153a8fcfb60343dcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.468267 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7a23337-f7f5-446b-8a26-1a92225474df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae502c18c3cc1944f67850104e8d7819cc3893d52b877b8f23c81387b745b7d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbdbc8b311c9de87b1d5cf57e19c4283e735b6d4579e9f6e468f5dd1f925578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1affd503f7d9b067489811090a51387fd2aa499509865030b0f93cded2062f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d86863a534719cf9493b82d08cf117a4e029d3c041f1718427c88d0ea909733\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.470970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.471063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.471089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.471122 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.471144 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.495069 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fafbfee2ff93d8b931969fe152f7ae3d698aaaa2d48538d4915fbfa3422f3880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.519725 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lm7vv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b17d6adf-e35b-4bf8-9ab2-e6720e595835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e
5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:08:55Z\\\",\\\"message\\\":\\\"2026-01-30T05:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16\\\\n2026-01-30T05:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_62497d82-dbe5-4b49-9856-765125ac2f16 to /host/opt/cni/bin/\\\\n2026-01-30T05:08:10Z [verbose] multus-daemon started\\\\n2026-01-30T05:08:10Z [verbose] Readiness Indicator file check\\\\n2026-01-30T05:08:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kkfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lm7vv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.556977 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa6b4bfc-bb66-4371-8e78-353873344f17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a9b5e4ff228a4713d44e1f3b6856f74e315f8a47abe7613c8e6813242ead57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://256d972e17b738e8e14a40cefd5e0d12f74ae5aba56f9c357e29f193ebd5b7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95346dd11511d4c19dd070873cc8b459a118c5f99ff87dff21965a39bfaf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c364285f503e7a0c201bd75f15c5913dafb298b41647b55569ee5251b9c8b4ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b67ef4253edb9455a783e664f610d2334412c679ac951b7f8bb24d3f58fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45011d12ca6fe225d496dc64afc4405629ec53acc2ebcda9fe14afa462261e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://719f0c75faa152591f9f72839b7e888b239077e56ef573f3bb247ce4a5a71ea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2567d1f1e9ab3a8a35740c8725efd69a6ae2022ba1e9f8771702839d5f54c97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.575285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.575335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.575351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.575374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.575394 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.583967 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f92025d1-3392-4c42-802e-b549f0bf4e7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 05:07:59.538717 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:07:59.541678 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-523852778/tls.crt::/tmp/serving-cert-523852778/tls.key\\\\\\\"\\\\nI0130 05:08:05.256370 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:05.274783 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:05.274829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:05.274866 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:05.274878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:05.290126 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:05.290156 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290160 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:05.290164 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:05.290167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:05.290170 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:05.290174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:05.290290 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:05.295265 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.608233 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.629745 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.656851 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06cb3786-294c-45f0-b414-66d84f8d5786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06a5c2b37c07fbb37c0c500395d9b5993f80e98aadeeee8eb46fd0244833a78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bfd37702d6bdbd580da18ffcc234ebbd7841ada15be7651a40e707ecce25afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79334886da3dbdc68e53047fc37af6a3b595e00416870302129378761b06d6f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d5e58ce5b45bf04b943e52fb7ce3bb1f03be834add77024a98e04b87d7657f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://668fe
0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://668fe0242a4500ab470502a033941344ddc824c25a664105485ff020c9b1e630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca75899b961dd42245ad6eb520d8c625639dda1a1c5ca494eca6346267f53af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42d7e1b38ba8c02ccddcd87a719a110d26b7906f53689726498904a122883ead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfvtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cdsw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.678098 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.678175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.678193 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.678214 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.678254 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.689314 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"556d9fc5-72b4-4134-8074-1e9d07012763\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T05:09:00Z\\\",\\\"message\\\":\\\"ints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0130 05:09:00.314584 6959 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 05:09:00.314585 6959 
services_controller.go:444] Built service openshift-dns-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314592 6959 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0130 05:09:00.314395 6959 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/marketplace-operator-metrics]} name:Service_openshift-marketplace/marketplace-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.53:8081: 10.217.5.53:8383:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {89fe421e-04e8-4967-ac75-77a0e6f784ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0130 05:09:00.314653 6959 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e6ad71fc07be5e4f
4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bshbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.706417 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cba59fc-ee1f-450a-ab9e-2743c1bbb933\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb77e9defc8c4121eae34daeca1948ee8aef2d6c884fb05b2a5c53e85cbe9c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e913d425338029033aa1073be9668185b9f9c4dbb2560466d086b52aa6ce17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e913d425338029033aa1073be9668185b9f9c4dbb2560466d086b52aa6ce17c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.726251 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.744448 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87d53ad37831a6ab12d1403e37e87caefd5dfa5703945f11ce17998748bbc932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.761984 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xjfpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c26ef8ba-80e9-4ce4-a950-9333ceda4fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d5081c972400ada81f4c0b439fae6d2bcd98a3ef693be6596edcdd202eb6766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zs6rf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xjfpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.779888 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"189be3dc-d439-47c2-b1f2-7413fc4b5e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db0a3f8d149cf186174eeafb40cec1
5384675a4263dd97c7a73e1e3ba8caab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6mq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-30T05:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfdxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.782198 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.782251 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.782271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.782298 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.782315 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.796207 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vtnpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99cb8b56-06fb-4497-82f8-d2ba1887be6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd74ac41f4e65cb4380bf0f4338b63ba1aa7de94874cc685cd55b7580896d00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2qqgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vtnpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.817596 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt48b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1421762e-4873-46cb-8c43-b8faa0cbca62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5dkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt48b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.842260 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeeda6865f3327eb7eaa6fc09558dc67cacc0750e7ef858b21a2cc654668cf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094e7a5134e512ac6f42e5672e7a84a101f2905333079e1fd22eb08a3c579257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.860410 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f069d6db-7396-4c40-9ea9-4cc66c499cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ecc1689992632455a84de692ae57730c3d457434258e04a43892135033a3fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ebe22d2604043e99ae63c45b8dd031466c2
6632279b08cb8cab1c220d18a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dss26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rmtbw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:09:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.886556 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.886666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.886695 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.886730 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.886755 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.990212 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.990293 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.990311 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.990338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.990362 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:25Z","lastTransitionTime":"2026-01-30T05:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:25 crc kubenswrapper[4931]: I0130 05:09:25.993107 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:25 crc kubenswrapper[4931]: E0130 05:09:25.993381 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:09:25 crc kubenswrapper[4931]: E0130 05:09:25.993547 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs podName:1421762e-4873-46cb-8c43-b8faa0cbca62 nodeName:}" failed. No retries permitted until 2026-01-30 05:10:29.993483541 +0000 UTC m=+165.363393838 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs") pod "network-metrics-daemon-gt48b" (UID: "1421762e-4873-46cb-8c43-b8faa0cbca62") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.094064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.094213 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.094236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.094381 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.094404 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.197039 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.197106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.197125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.197152 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.197175 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.301083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.301149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.301171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.301202 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.301237 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.406026 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.406111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.406131 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.406161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.406182 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.431944 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 19:25:00.529683599 +0000 UTC Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.510483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.510537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.510553 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.510580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.510602 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.614096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.614156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.614179 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.614208 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.614228 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.717441 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.717510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.717528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.717554 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.717572 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.820539 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.820621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.820642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.820676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.820695 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.923599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.923672 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.923691 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.923723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:26 crc kubenswrapper[4931]: I0130 05:09:26.923744 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:26Z","lastTransitionTime":"2026-01-30T05:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.027591 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.027683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.027703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.027738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.027762 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.133378 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.133477 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.133497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.133574 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.133597 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.237118 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.237183 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.237201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.237225 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.237243 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.340021 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.340100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.340129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.340161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.340182 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.421538 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.421619 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.421648 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:27 crc kubenswrapper[4931]: E0130 05:09:27.422195 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.421722 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:27 crc kubenswrapper[4931]: E0130 05:09:27.422323 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:27 crc kubenswrapper[4931]: E0130 05:09:27.422461 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:27 crc kubenswrapper[4931]: E0130 05:09:27.422647 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.433063 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 14:53:36.631224708 +0000 UTC Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.443910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.443957 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.443970 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.443986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.444000 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.552478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.552568 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.552594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.552630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.552661 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.657255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.657332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.657352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.657383 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.657403 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.760633 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.760768 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.760789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.760814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.760832 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.864485 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.864559 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.864577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.864610 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.864632 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.967885 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.968006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.968029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.968058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:27 crc kubenswrapper[4931]: I0130 05:09:27.968080 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:27Z","lastTransitionTime":"2026-01-30T05:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.071998 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.072076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.072096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.072126 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.072150 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.175565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.175666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.175699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.175741 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.175762 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.279180 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.279294 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.279322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.279362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.279388 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.383607 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.383675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.383695 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.383721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.383742 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.434161 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:52:08.358951009 +0000 UTC Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.488114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.488245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.488269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.488303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.488331 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.592257 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.592323 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.592341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.592368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.592387 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.695396 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.695511 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.695532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.695569 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.695600 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.799009 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.799385 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.799594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.799750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.799879 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.904169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.904246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.904266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.904299 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:28 crc kubenswrapper[4931]: I0130 05:09:28.904321 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:28Z","lastTransitionTime":"2026-01-30T05:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.007889 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.007964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.007982 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.008009 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.008037 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.111635 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.111702 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.111720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.111747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.111766 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.218893 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.219303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.219573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.221109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.221166 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.326217 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.326319 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.326342 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.326373 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.326401 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.421633 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:29 crc kubenswrapper[4931]: E0130 05:09:29.421799 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.422645 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:29 crc kubenswrapper[4931]: E0130 05:09:29.422729 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.422757 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.422909 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:29 crc kubenswrapper[4931]: E0130 05:09:29.422955 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bshbf_openshift-ovn-kubernetes(556d9fc5-72b4-4134-8074-1e9d07012763)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" Jan 30 05:09:29 crc kubenswrapper[4931]: E0130 05:09:29.422977 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.423131 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:29 crc kubenswrapper[4931]: E0130 05:09:29.423200 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.429304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.429336 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.429346 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.429360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.429373 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.434566 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 12:39:45.422873232 +0000 UTC Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.532665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.532699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.532711 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.532725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.532735 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.636014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.636078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.636095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.636119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.636137 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.739106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.739221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.739240 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.739273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.739294 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.842201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.842288 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.842312 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.842343 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.842369 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.945801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.945863 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.945884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.945910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:29 crc kubenswrapper[4931]: I0130 05:09:29.945929 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:29Z","lastTransitionTime":"2026-01-30T05:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.049650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.049716 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.049738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.049770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.049795 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.152925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.152997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.153019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.153052 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.153075 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.257940 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.258017 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.258040 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.258072 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.258099 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.361314 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.361394 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.361411 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.361470 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.361492 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.435322 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 06:57:32.502862918 +0000 UTC Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.464630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.464700 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.464722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.464752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.464773 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.567214 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.567264 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.567290 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.567319 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.567355 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.670641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.670721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.670741 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.670768 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.670786 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.774148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.774209 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.774226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.774253 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.774272 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.878340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.878405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.878461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.878497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.878521 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.981696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.981773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.981798 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.981828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:30 crc kubenswrapper[4931]: I0130 05:09:30.981848 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:30Z","lastTransitionTime":"2026-01-30T05:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.084793 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.084928 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.084955 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.084987 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.085051 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.189135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.189182 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.189193 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.189212 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.189225 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.292517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.292687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.292709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.292735 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.292789 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.396735 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.396833 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.396863 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.396895 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.396918 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.421759 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.421763 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.421773 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:31 crc kubenswrapper[4931]: E0130 05:09:31.421996 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.422050 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:31 crc kubenswrapper[4931]: E0130 05:09:31.422288 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:31 crc kubenswrapper[4931]: E0130 05:09:31.422354 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:31 crc kubenswrapper[4931]: E0130 05:09:31.422592 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.435924 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 07:59:46.557802713 +0000 UTC Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.500645 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.500720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.500738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.500764 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.500783 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.605214 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.605285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.605303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.605332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.605350 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.709043 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.709129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.709152 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.709188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.709214 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.813350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.813409 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.813460 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.813488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.813508 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.917134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.917191 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.917201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.917221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:31 crc kubenswrapper[4931]: I0130 05:09:31.917237 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:31Z","lastTransitionTime":"2026-01-30T05:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.020613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.020682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.020713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.020739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.020763 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.123906 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.123959 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.123971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.123991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.124005 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.227259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.227327 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.227352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.227387 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.227413 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.332225 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.332306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.332331 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.332361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.332387 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.436071 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 09:20:44.696760998 +0000 UTC Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.436393 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.436528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.436549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.436579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.436601 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.539999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.540068 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.540087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.540114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.540135 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.645409 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.645509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.645529 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.645557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.645577 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.749154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.749225 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.749248 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.749283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.749307 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.853125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.853204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.853225 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.853252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.853269 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.956957 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.957042 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.957073 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.957140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:32 crc kubenswrapper[4931]: I0130 05:09:32.957166 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:32Z","lastTransitionTime":"2026-01-30T05:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.059896 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.059962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.059990 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.060018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.060039 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.164017 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.164111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.164154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.164192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.164219 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.267911 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.267998 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.268027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.268059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.268084 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.372357 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.372535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.372565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.372599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.372621 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.421851 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.422114 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.422158 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.422377 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:33 crc kubenswrapper[4931]: E0130 05:09:33.422382 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:33 crc kubenswrapper[4931]: E0130 05:09:33.422580 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:33 crc kubenswrapper[4931]: E0130 05:09:33.422676 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:33 crc kubenswrapper[4931]: E0130 05:09:33.422827 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.436994 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 09:30:14.893172783 +0000 UTC Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.476365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.476488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.476519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.476587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.476608 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.581313 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.581369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.581382 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.581402 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.581418 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.684627 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.684712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.684742 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.684776 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.684800 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.788892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.788971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.788991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.789019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.789044 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.891637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.891710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.891728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.891752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.891770 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.995190 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.995366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.995413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.995501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:33 crc kubenswrapper[4931]: I0130 05:09:33.995528 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:33Z","lastTransitionTime":"2026-01-30T05:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.098781 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.098839 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.098849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.098873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.098884 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:34Z","lastTransitionTime":"2026-01-30T05:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.201991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.202054 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.202067 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.202087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.202104 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:34Z","lastTransitionTime":"2026-01-30T05:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.304907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.304962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.304972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.304989 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.305002 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:34Z","lastTransitionTime":"2026-01-30T05:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.373405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.373497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.373517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.373543 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.373562 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:09:34Z","lastTransitionTime":"2026-01-30T05:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.431043 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9"] Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.431635 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.434267 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.434497 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.434726 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.434837 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.437192 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 04:53:30.030887631 +0000 UTC Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.437244 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.448931 4931 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.491930 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rmtbw" podStartSLOduration=87.491899364 podStartE2EDuration="1m27.491899364s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.456518183 +0000 UTC m=+109.826428510" watchObservedRunningTime="2026-01-30 
05:09:34.491899364 +0000 UTC m=+109.861809621" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.495768 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e05528c-e400-4c01-98f1-e97adf895d92-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.495891 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1e05528c-e400-4c01-98f1-e97adf895d92-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.495949 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e05528c-e400-4c01-98f1-e97adf895d92-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.496176 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e05528c-e400-4c01-98f1-e97adf895d92-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.496756 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1e05528c-e400-4c01-98f1-e97adf895d92-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.510524 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.51048984 podStartE2EDuration="1m0.51048984s" podCreationTimestamp="2026-01-30 05:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.510334676 +0000 UTC m=+109.880244933" watchObservedRunningTime="2026-01-30 05:09:34.51048984 +0000 UTC m=+109.880400137" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.564788 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lm7vv" podStartSLOduration=87.564750596 podStartE2EDuration="1m27.564750596s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.551260189 +0000 UTC m=+109.921170456" watchObservedRunningTime="2026-01-30 05:09:34.564750596 +0000 UTC m=+109.934660893" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.598533 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1e05528c-e400-4c01-98f1-e97adf895d92-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.598669 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e05528c-e400-4c01-98f1-e97adf895d92-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.598707 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1e05528c-e400-4c01-98f1-e97adf895d92-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.598704 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1e05528c-e400-4c01-98f1-e97adf895d92-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.598748 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e05528c-e400-4c01-98f1-e97adf895d92-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.598830 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e05528c-e400-4c01-98f1-e97adf895d92-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.598980 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1e05528c-e400-4c01-98f1-e97adf895d92-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.599888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e05528c-e400-4c01-98f1-e97adf895d92-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.609885 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.609845612 podStartE2EDuration="1m22.609845612s" podCreationTimestamp="2026-01-30 05:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.583696613 +0000 UTC m=+109.953606920" watchObservedRunningTime="2026-01-30 05:09:34.609845612 +0000 UTC m=+109.979755909" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.609967 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e05528c-e400-4c01-98f1-e97adf895d92-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.610254 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.610244033 podStartE2EDuration="1m29.610244033s" podCreationTimestamp="2026-01-30 05:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.608590065 +0000 UTC m=+109.978500322" watchObservedRunningTime="2026-01-30 05:09:34.610244033 +0000 UTC m=+109.980154330" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.623121 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e05528c-e400-4c01-98f1-e97adf895d92-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9dsd9\" (UID: \"1e05528c-e400-4c01-98f1-e97adf895d92\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.694916 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cdsw5" podStartSLOduration=87.694863231 podStartE2EDuration="1m27.694863231s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.694315955 +0000 UTC m=+110.064226222" watchObservedRunningTime="2026-01-30 05:09:34.694863231 +0000 UTC m=+110.064773488" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.771868 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.771834724 podStartE2EDuration="1m28.771834724s" podCreationTimestamp="2026-01-30 05:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.770569047 +0000 UTC m=+110.140479304" 
watchObservedRunningTime="2026-01-30 05:09:34.771834724 +0000 UTC m=+110.141744981" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.771923 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.834536 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xjfpj" podStartSLOduration=87.834516017 podStartE2EDuration="1m27.834516017s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.819177326 +0000 UTC m=+110.189087583" watchObservedRunningTime="2026-01-30 05:09:34.834516017 +0000 UTC m=+110.204426274" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.834634 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podStartSLOduration=87.834630791 podStartE2EDuration="1m27.834630791s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.833797996 +0000 UTC m=+110.203708253" watchObservedRunningTime="2026-01-30 05:09:34.834630791 +0000 UTC m=+110.204541048" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.845219 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vtnpc" podStartSLOduration=87.845213032 podStartE2EDuration="1m27.845213032s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.84514956 +0000 UTC m=+110.215059817" watchObservedRunningTime="2026-01-30 
05:09:34.845213032 +0000 UTC m=+110.215123289" Jan 30 05:09:34 crc kubenswrapper[4931]: I0130 05:09:34.871934 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=27.871904196 podStartE2EDuration="27.871904196s" podCreationTimestamp="2026-01-30 05:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:34.87134603 +0000 UTC m=+110.241256297" watchObservedRunningTime="2026-01-30 05:09:34.871904196 +0000 UTC m=+110.241814463" Jan 30 05:09:35 crc kubenswrapper[4931]: I0130 05:09:35.246838 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" event={"ID":"1e05528c-e400-4c01-98f1-e97adf895d92","Type":"ContainerStarted","Data":"8928f664343ffe393101efb5da1dc68f33ba1bc5030ba2639756113dd78479a2"} Jan 30 05:09:35 crc kubenswrapper[4931]: I0130 05:09:35.246932 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" event={"ID":"1e05528c-e400-4c01-98f1-e97adf895d92","Type":"ContainerStarted","Data":"352cf410d6dc3550ccd5bbea2ce945e73032e1f1b716ddde2f2a24c7464b9087"} Jan 30 05:09:35 crc kubenswrapper[4931]: I0130 05:09:35.274032 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9dsd9" podStartSLOduration=88.274006399 podStartE2EDuration="1m28.274006399s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:35.271895627 +0000 UTC m=+110.641805924" watchObservedRunningTime="2026-01-30 05:09:35.274006399 +0000 UTC m=+110.643916696" Jan 30 05:09:35 crc kubenswrapper[4931]: I0130 05:09:35.421249 4931 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:35 crc kubenswrapper[4931]: I0130 05:09:35.421360 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:35 crc kubenswrapper[4931]: E0130 05:09:35.421555 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:35 crc kubenswrapper[4931]: I0130 05:09:35.421576 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:35 crc kubenswrapper[4931]: I0130 05:09:35.421655 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:35 crc kubenswrapper[4931]: E0130 05:09:35.424083 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:35 crc kubenswrapper[4931]: E0130 05:09:35.424241 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:35 crc kubenswrapper[4931]: E0130 05:09:35.424476 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:37 crc kubenswrapper[4931]: I0130 05:09:37.421060 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:37 crc kubenswrapper[4931]: I0130 05:09:37.421174 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:37 crc kubenswrapper[4931]: I0130 05:09:37.421262 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:37 crc kubenswrapper[4931]: E0130 05:09:37.421255 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:37 crc kubenswrapper[4931]: I0130 05:09:37.421054 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:37 crc kubenswrapper[4931]: E0130 05:09:37.421453 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:37 crc kubenswrapper[4931]: E0130 05:09:37.421636 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:37 crc kubenswrapper[4931]: E0130 05:09:37.421754 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:39 crc kubenswrapper[4931]: I0130 05:09:39.422036 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:39 crc kubenswrapper[4931]: I0130 05:09:39.422107 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:39 crc kubenswrapper[4931]: E0130 05:09:39.422628 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:39 crc kubenswrapper[4931]: I0130 05:09:39.422196 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:39 crc kubenswrapper[4931]: I0130 05:09:39.422164 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:39 crc kubenswrapper[4931]: E0130 05:09:39.422803 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:39 crc kubenswrapper[4931]: E0130 05:09:39.423207 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:39 crc kubenswrapper[4931]: E0130 05:09:39.423369 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:41 crc kubenswrapper[4931]: I0130 05:09:41.421527 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:41 crc kubenswrapper[4931]: I0130 05:09:41.421673 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:41 crc kubenswrapper[4931]: I0130 05:09:41.421722 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:41 crc kubenswrapper[4931]: I0130 05:09:41.421737 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:41 crc kubenswrapper[4931]: E0130 05:09:41.422610 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:41 crc kubenswrapper[4931]: E0130 05:09:41.422879 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:41 crc kubenswrapper[4931]: E0130 05:09:41.423070 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:41 crc kubenswrapper[4931]: E0130 05:09:41.423185 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:42 crc kubenswrapper[4931]: I0130 05:09:42.277122 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/1.log" Jan 30 05:09:42 crc kubenswrapper[4931]: I0130 05:09:42.278654 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/0.log" Jan 30 05:09:42 crc kubenswrapper[4931]: I0130 05:09:42.278754 4931 generic.go:334] "Generic (PLEG): container finished" podID="b17d6adf-e35b-4bf8-9ab2-e6720e595835" containerID="c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0" exitCode=1 Jan 30 05:09:42 crc kubenswrapper[4931]: I0130 05:09:42.278827 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerDied","Data":"c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0"} Jan 30 05:09:42 crc kubenswrapper[4931]: I0130 05:09:42.278949 4931 scope.go:117] "RemoveContainer" containerID="71e5ae535724d0bf08b647693f5ce248b193261779f219e61f21eeb5b8263899" Jan 30 05:09:42 crc kubenswrapper[4931]: I0130 05:09:42.279606 4931 scope.go:117] "RemoveContainer" containerID="c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0" Jan 30 05:09:42 crc kubenswrapper[4931]: E0130 05:09:42.279943 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lm7vv_openshift-multus(b17d6adf-e35b-4bf8-9ab2-e6720e595835)\"" pod="openshift-multus/multus-lm7vv" podUID="b17d6adf-e35b-4bf8-9ab2-e6720e595835" Jan 30 05:09:43 crc kubenswrapper[4931]: I0130 05:09:43.285833 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/1.log" Jan 30 05:09:43 crc kubenswrapper[4931]: I0130 05:09:43.421713 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:43 crc kubenswrapper[4931]: I0130 05:09:43.421775 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:43 crc kubenswrapper[4931]: I0130 05:09:43.421713 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:43 crc kubenswrapper[4931]: I0130 05:09:43.421912 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:43 crc kubenswrapper[4931]: E0130 05:09:43.422051 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:43 crc kubenswrapper[4931]: E0130 05:09:43.422330 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:43 crc kubenswrapper[4931]: E0130 05:09:43.422403 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:43 crc kubenswrapper[4931]: E0130 05:09:43.422538 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:44 crc kubenswrapper[4931]: I0130 05:09:44.422275 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.297952 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/3.log" Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.302109 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerStarted","Data":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.302663 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:09:45 crc 
kubenswrapper[4931]: I0130 05:09:45.347627 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podStartSLOduration=98.347599558 podStartE2EDuration="1m38.347599558s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:45.346863676 +0000 UTC m=+120.716774013" watchObservedRunningTime="2026-01-30 05:09:45.347599558 +0000 UTC m=+120.717509855" Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.371457 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gt48b"] Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.371612 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:45 crc kubenswrapper[4931]: E0130 05:09:45.371713 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.421564 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.421565 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:45 crc kubenswrapper[4931]: E0130 05:09:45.421798 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:45 crc kubenswrapper[4931]: E0130 05:09:45.422072 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:45 crc kubenswrapper[4931]: I0130 05:09:45.422660 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:45 crc kubenswrapper[4931]: E0130 05:09:45.424544 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:45 crc kubenswrapper[4931]: E0130 05:09:45.451602 4931 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 30 05:09:45 crc kubenswrapper[4931]: E0130 05:09:45.561038 4931 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 05:09:47 crc kubenswrapper[4931]: I0130 05:09:47.422965 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:47 crc kubenswrapper[4931]: I0130 05:09:47.423490 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:47 crc kubenswrapper[4931]: E0130 05:09:47.423683 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:47 crc kubenswrapper[4931]: I0130 05:09:47.423381 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:47 crc kubenswrapper[4931]: I0130 05:09:47.424078 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:47 crc kubenswrapper[4931]: E0130 05:09:47.424326 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:47 crc kubenswrapper[4931]: E0130 05:09:47.424832 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:47 crc kubenswrapper[4931]: E0130 05:09:47.424909 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:49 crc kubenswrapper[4931]: I0130 05:09:49.421411 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:49 crc kubenswrapper[4931]: I0130 05:09:49.421554 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:49 crc kubenswrapper[4931]: I0130 05:09:49.421595 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:49 crc kubenswrapper[4931]: E0130 05:09:49.421716 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:49 crc kubenswrapper[4931]: E0130 05:09:49.421886 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:49 crc kubenswrapper[4931]: E0130 05:09:49.421982 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:49 crc kubenswrapper[4931]: I0130 05:09:49.422698 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:49 crc kubenswrapper[4931]: E0130 05:09:49.422950 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:50 crc kubenswrapper[4931]: E0130 05:09:50.562361 4931 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 05:09:51 crc kubenswrapper[4931]: I0130 05:09:51.420998 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:51 crc kubenswrapper[4931]: I0130 05:09:51.421069 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:51 crc kubenswrapper[4931]: I0130 05:09:51.421117 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:51 crc kubenswrapper[4931]: I0130 05:09:51.421019 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:51 crc kubenswrapper[4931]: E0130 05:09:51.421201 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:51 crc kubenswrapper[4931]: E0130 05:09:51.421467 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:51 crc kubenswrapper[4931]: E0130 05:09:51.421491 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:51 crc kubenswrapper[4931]: E0130 05:09:51.421554 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:53 crc kubenswrapper[4931]: I0130 05:09:53.422008 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:53 crc kubenswrapper[4931]: I0130 05:09:53.422114 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:53 crc kubenswrapper[4931]: I0130 05:09:53.422199 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:53 crc kubenswrapper[4931]: I0130 05:09:53.422229 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:53 crc kubenswrapper[4931]: E0130 05:09:53.422221 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:53 crc kubenswrapper[4931]: E0130 05:09:53.422401 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:53 crc kubenswrapper[4931]: E0130 05:09:53.422649 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:53 crc kubenswrapper[4931]: E0130 05:09:53.422775 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:55 crc kubenswrapper[4931]: I0130 05:09:55.421624 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:55 crc kubenswrapper[4931]: I0130 05:09:55.421686 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:55 crc kubenswrapper[4931]: I0130 05:09:55.421648 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:55 crc kubenswrapper[4931]: I0130 05:09:55.424932 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:55 crc kubenswrapper[4931]: E0130 05:09:55.425019 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:55 crc kubenswrapper[4931]: E0130 05:09:55.425366 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:55 crc kubenswrapper[4931]: E0130 05:09:55.425628 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:55 crc kubenswrapper[4931]: I0130 05:09:55.425733 4931 scope.go:117] "RemoveContainer" containerID="c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0" Jan 30 05:09:55 crc kubenswrapper[4931]: E0130 05:09:55.425916 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:55 crc kubenswrapper[4931]: E0130 05:09:55.563647 4931 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Jan 30 05:09:56 crc kubenswrapper[4931]: I0130 05:09:56.349177 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/1.log" Jan 30 05:09:56 crc kubenswrapper[4931]: I0130 05:09:56.349244 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerStarted","Data":"9cbe0bfee502f12e8f2f3a6f1a461efb27353f5529809ccc54fecbb26b304ada"} Jan 30 05:09:57 crc kubenswrapper[4931]: I0130 05:09:57.421895 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:57 crc kubenswrapper[4931]: I0130 05:09:57.421951 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:57 crc kubenswrapper[4931]: I0130 05:09:57.422010 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:57 crc kubenswrapper[4931]: E0130 05:09:57.422208 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:57 crc kubenswrapper[4931]: I0130 05:09:57.422269 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:57 crc kubenswrapper[4931]: E0130 05:09:57.422473 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:57 crc kubenswrapper[4931]: E0130 05:09:57.422536 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62" Jan 30 05:09:57 crc kubenswrapper[4931]: E0130 05:09:57.422620 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:09:59 crc kubenswrapper[4931]: I0130 05:09:59.421158 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:59 crc kubenswrapper[4931]: I0130 05:09:59.421211 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:09:59 crc kubenswrapper[4931]: I0130 05:09:59.421261 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:09:59 crc kubenswrapper[4931]: I0130 05:09:59.421298 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:09:59 crc kubenswrapper[4931]: E0130 05:09:59.422964 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:09:59 crc kubenswrapper[4931]: E0130 05:09:59.423172 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:09:59 crc kubenswrapper[4931]: E0130 05:09:59.423284 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt48b" podUID="1421762e-4873-46cb-8c43-b8faa0cbca62"
Jan 30 05:09:59 crc kubenswrapper[4931]: E0130 05:09:59.423497 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.421046 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.421052 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.421211 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.421135 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.424398 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.424906 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.425284 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.425689 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.425833 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 30 05:10:01 crc kubenswrapper[4931]: I0130 05:10:01.425838 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.881739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.954763 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.955537 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.957615 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ww4ml"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.958149 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.959822 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8ndkb"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.960853 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.962743 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsn4r"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.963182 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.963829 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.964505 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.967494 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.967859 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.969899 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ff4lr"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.970864 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.971028 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.971408 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.971640 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.974833 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.975298 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.976038 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.979808 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"]
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.980003 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.989929 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.990403 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.994539 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.990857 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.993882 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.991292 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.991452 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.992847 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.993947 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.994592 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 30 05:10:04 crc kubenswrapper[4931]: I0130 05:10:04.994833 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.012789 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.012867 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.014072 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.015336 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.015994 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.016148 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.016272 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.021676 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.021907 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.022088 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.022297 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.022457 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tbgzs"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.022637 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.022816 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.023050 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.022786 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tbgzs"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.023641 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7jf2b"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.024347 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.024506 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.024592 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.024807 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7jf2b"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.027405 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.027903 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.028093 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.028185 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.028608 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.028805 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.028900 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.029010 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.028846 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.029160 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.029183 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.029206 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.029303 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.029308 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.029717 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.032481 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.033297 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.036840 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.037794 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.038172 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f8zg7"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.038720 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.039197 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.039400 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.039649 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.039968 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040479 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d163e-7881-411f-a61b-a00e9c8bc9dc-config\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040534 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040611 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442g6\" (UniqueName: \"kubernetes.io/projected/cd36df00-a4ac-44ab-bdee-fcf018713f78-kube-api-access-442g6\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040636 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e06ad469-0fb9-47d7-90fc-3c74ef8bb833-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7242g\" (UID: \"e06ad469-0fb9-47d7-90fc-3c74ef8bb833\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040673 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vwh\" (UniqueName: \"kubernetes.io/projected/62b9975b-f28e-46de-89a0-bac3d2e7f927-kube-api-access-77vwh\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040698 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbab60d9-c5df-4396-8012-94dc987f82c2-node-pullsecrets\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040717 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61a1f22c-baac-4356-9d01-ec2b51700b3a-serving-cert\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040743 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040764 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-serving-cert\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040784 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-service-ca-bundle\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040808 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040831 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-audit\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040851 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040870 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/177d163e-7881-411f-a61b-a00e9c8bc9dc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd326f4-63cb-4c1d-bb6c-98118a45f714-serving-cert\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040913 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040932 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-oauth-config\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040963 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-trusted-ca-bundle\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.040985 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-config\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041008 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-serving-cert\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041054 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041076 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-etcd-serving-ca\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041099 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041121 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22qxm\" (UniqueName: \"kubernetes.io/projected/4fd326f4-63cb-4c1d-bb6c-98118a45f714-kube-api-access-22qxm\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041144 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041170 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-client-ca\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041192 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62b9975b-f28e-46de-89a0-bac3d2e7f927-machine-approver-tls\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041235 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-config\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041257 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-policies\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041276 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-dir\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041299 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62b9975b-f28e-46de-89a0-bac3d2e7f927-auth-proxy-config\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041320 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-config\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041354 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-client-ca\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041379 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd36df00-a4ac-44ab-bdee-fcf018713f78-serving-cert\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041403 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041447 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-config\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041473 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2kc7\" (UniqueName: \"kubernetes.io/projected/61a1f22c-baac-4356-9d01-ec2b51700b3a-kube-api-access-b2kc7\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041494 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9975b-f28e-46de-89a0-bac3d2e7f927-config\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041528 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrkft\" (UniqueName: \"kubernetes.io/projected/dbab60d9-c5df-4396-8012-94dc987f82c2-kube-api-access-lrkft\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041614 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmxjn\" (UniqueName: \"kubernetes.io/projected/cf0e8eba-09e8-4d9c-87de-9c57583e7276-kube-api-access-cmxjn\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041641 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-oauth-serving-cert\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041687 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/177d163e-7881-411f-a61b-a00e9c8bc9dc-images\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041713 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-config\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041770 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-encryption-config\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041797 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-etcd-client\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041823 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041874 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q2wf\" (UniqueName:
\"kubernetes.io/projected/177d163e-7881-411f-a61b-a00e9c8bc9dc-kube-api-access-5q2wf\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041895 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-service-ca\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041941 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwlrb\" (UniqueName: \"kubernetes.io/projected/e06ad469-0fb9-47d7-90fc-3c74ef8bb833-kube-api-access-mwlrb\") pod \"cluster-samples-operator-665b6dd947-7242g\" (UID: \"e06ad469-0fb9-47d7-90fc-3c74ef8bb833\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.041965 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbab60d9-c5df-4396-8012-94dc987f82c2-audit-dir\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.042012 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-image-import-ca\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 
05:10:05.042039 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.042062 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmhv\" (UniqueName: \"kubernetes.io/projected/45ceead9-96b4-4b3c-9fba-1288da84db97-kube-api-access-gvmhv\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.042628 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.042753 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wwbmr"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.045662 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-r62wb"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.046299 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.047386 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.047495 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.048123 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.055401 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.060550 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-268mt"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.061043 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.061965 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.062249 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.067613 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.087513 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.089329 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.090115 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.091119 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.091479 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.092080 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.092389 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ww4ml"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.092444 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsn4r"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.092460 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8ndkb"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.097615 4931 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.097938 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.098145 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.098409 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.098607 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.098782 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.099378 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.101943 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.102385 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.102700 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.103907 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 
30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.104074 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.104483 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.104956 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c8568"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.105674 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.105860 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.140839 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.141997 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145492 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145532 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/dbab60d9-c5df-4396-8012-94dc987f82c2-audit-dir\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145553 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q2wf\" (UniqueName: \"kubernetes.io/projected/177d163e-7881-411f-a61b-a00e9c8bc9dc-kube-api-access-5q2wf\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145573 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-service-ca\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145595 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwlrb\" (UniqueName: \"kubernetes.io/projected/e06ad469-0fb9-47d7-90fc-3c74ef8bb833-kube-api-access-mwlrb\") pod \"cluster-samples-operator-665b6dd947-7242g\" (UID: \"e06ad469-0fb9-47d7-90fc-3c74ef8bb833\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-image-import-ca\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145637 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvmhv\" (UniqueName: \"kubernetes.io/projected/45ceead9-96b4-4b3c-9fba-1288da84db97-kube-api-access-gvmhv\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145678 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d163e-7881-411f-a61b-a00e9c8bc9dc-config\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145699 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145721 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: 
\"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145745 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442g6\" (UniqueName: \"kubernetes.io/projected/cd36df00-a4ac-44ab-bdee-fcf018713f78-kube-api-access-442g6\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145762 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e06ad469-0fb9-47d7-90fc-3c74ef8bb833-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7242g\" (UID: \"e06ad469-0fb9-47d7-90fc-3c74ef8bb833\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145785 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9ce65b-1339-4198-ae4d-5697206eba5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145805 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vwh\" (UniqueName: \"kubernetes.io/projected/62b9975b-f28e-46de-89a0-bac3d2e7f927-kube-api-access-77vwh\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145825 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbab60d9-c5df-4396-8012-94dc987f82c2-node-pullsecrets\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145842 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61a1f22c-baac-4356-9d01-ec2b51700b3a-serving-cert\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145860 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.145888 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-serving-cert\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.146006 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-service-ca-bundle\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.146895 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-service-ca-bundle\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147010 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147050 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147101 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/177d163e-7881-411f-a61b-a00e9c8bc9dc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147129 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147227 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d9ce65b-1339-4198-ae4d-5697206eba5f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147255 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-audit\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147306 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd326f4-63cb-4c1d-bb6c-98118a45f714-serving-cert\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147333 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147352 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d9ce65b-1339-4198-ae4d-5697206eba5f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147374 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-oauth-config\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147391 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-service-ca\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147446 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-config\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147485 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-serving-cert\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147505 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-trusted-ca-bundle\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147525 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-config\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147545 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kpsv\" (UniqueName: \"kubernetes.io/projected/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-kube-api-access-9kpsv\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147556 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147565 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147590 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147612 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-etcd-serving-ca\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147631 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d44nl\" (UniqueName: \"kubernetes.io/projected/9d9ce65b-1339-4198-ae4d-5697206eba5f-kube-api-access-d44nl\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147643 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147656 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147684 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22qxm\" (UniqueName: \"kubernetes.io/projected/4fd326f4-63cb-4c1d-bb6c-98118a45f714-kube-api-access-22qxm\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147707 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147728 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-trusted-ca\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147748 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-client-ca\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147789 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62b9975b-f28e-46de-89a0-bac3d2e7f927-machine-approver-tls\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147812 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-config\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147832 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-policies\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147854 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-dir\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147873 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62b9975b-f28e-46de-89a0-bac3d2e7f927-auth-proxy-config\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147893 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-config\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147922 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-client-ca\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147964 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd36df00-a4ac-44ab-bdee-fcf018713f78-serving-cert\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.147983 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-config\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148001 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148020 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-ca\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148043 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-client\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148069 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2kc7\" (UniqueName: \"kubernetes.io/projected/61a1f22c-baac-4356-9d01-ec2b51700b3a-kube-api-access-b2kc7\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148088 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9975b-f28e-46de-89a0-bac3d2e7f927-config\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148106 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-config\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148110 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148138 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrkft\" (UniqueName: \"kubernetes.io/projected/dbab60d9-c5df-4396-8012-94dc987f82c2-kube-api-access-lrkft\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148160 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmxjn\" (UniqueName: \"kubernetes.io/projected/cf0e8eba-09e8-4d9c-87de-9c57583e7276-kube-api-access-cmxjn\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148177 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvpkw\" (UniqueName: \"kubernetes.io/projected/3ac5359b-e653-4824-ad6f-4672970dc0cc-kube-api-access-dvpkw\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148195 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-serving-cert\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148214 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/177d163e-7881-411f-a61b-a00e9c8bc9dc-images\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148233 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-oauth-serving-cert\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-config\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148288 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-encryption-config\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac5359b-e653-4824-ad6f-4672970dc0cc-serving-cert\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148318 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.148338 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-etcd-client\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.150447 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-client-ca\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.150950 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-config\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.153096 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.153164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dbab60d9-c5df-4396-8012-94dc987f82c2-audit-dir\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.155257 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-service-ca\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.155653 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.156013 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-serving-cert\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.157130 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d163e-7881-411f-a61b-a00e9c8bc9dc-config\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.157575 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.157823 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.158975 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.161026 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.161364 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-policies\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.162587 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9975b-f28e-46de-89a0-bac3d2e7f927-config\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.163863 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.164335 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.164503 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/177d163e-7881-411f-a61b-a00e9c8bc9dc-images\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.164883 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.165282 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.165630 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.165897 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.165966 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-phq4q"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.166226 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-oauth-serving-cert\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.166385 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-dir\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.166408 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.166776 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.167637 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lmnvn"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.168389 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-config\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.168697 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62b9975b-f28e-46de-89a0-bac3d2e7f927-auth-proxy-config\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.168707 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.169111 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.169750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-image-import-ca\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.169868 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-config\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.170423 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-etcd-serving-ca\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.170658 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.170743 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.173422 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.174157 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.171349 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-config\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.175450 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.175879 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.171037 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.171171 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-client-ca\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.171142 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.176294 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.177167 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dbab60d9-c5df-4396-8012-94dc987f82c2-node-pullsecrets\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.177309 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.181542 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.182313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.182593 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.182696 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.182809 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.182856 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-encryption-config\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.182971 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-config\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183076 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-serving-cert\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183232 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183290 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-audit\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183278 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183323 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183238 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183383 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183494 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.183776 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.184295 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.184301 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.184578 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/177d163e-7881-411f-a61b-a00e9c8bc9dc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.184693 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.184817 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.184905 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.184996 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.185055 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.185074 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e06ad469-0fb9-47d7-90fc-3c74ef8bb833-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7242g\" (UID: \"e06ad469-0fb9-47d7-90fc-3c74ef8bb833\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.185164 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.185185 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.185958 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.186241 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.186322 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.186452 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbab60d9-c5df-4396-8012-94dc987f82c2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.186868 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.187839 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd36df00-a4ac-44ab-bdee-fcf018713f78-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.187884 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x"]
Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.188772 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.202913 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.202834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dbab60d9-c5df-4396-8012-94dc987f82c2-etcd-client\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.204099 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.204271 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-trusted-ca-bundle\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.204757 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62b9975b-f28e-46de-89a0-bac3d2e7f927-machine-approver-tls\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.205120 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.205536 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd36df00-a4ac-44ab-bdee-fcf018713f78-serving-cert\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.205648 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.207034 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.208950 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-oauth-config\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.211594 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.211634 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.212108 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.212875 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.216280 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61a1f22c-baac-4356-9d01-ec2b51700b3a-serving-cert\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.214628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.215195 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.216295 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ff4lr"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.214239 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd326f4-63cb-4c1d-bb6c-98118a45f714-serving-cert\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.217343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.218064 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.219074 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.221268 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.226741 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.226774 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.229257 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.229384 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.230030 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-scnkp"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.230463 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-scnkp" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.230718 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.230946 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.231709 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.232814 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.233342 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.234252 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4jb99"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.234977 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.235934 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.236683 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tbgzs"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.238614 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.239347 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.240575 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c8568"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.241796 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.242680 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.243926 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.245257 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.246715 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f8zg7"] Jan 30 
05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.247992 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bj2bf"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249260 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f10add70-0777-45cf-9555-7bda3b6ebeec-trusted-ca\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249313 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249358 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9ce65b-1339-4198-ae4d-5697206eba5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249381 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e8738e-651f-4f09-a052-1ff22028e3f3-service-ca-bundle\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 
05:10:05.249403 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5lp4\" (UniqueName: \"kubernetes.io/projected/6e653e21-3e72-4867-b39e-f374d752d503-kube-api-access-f5lp4\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249443 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8pv9\" (UniqueName: \"kubernetes.io/projected/f10add70-0777-45cf-9555-7bda3b6ebeec-kube-api-access-l8pv9\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249552 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249622 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d9ce65b-1339-4198-ae4d-5697206eba5f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249638 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249654 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e653e21-3e72-4867-b39e-f374d752d503-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249719 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d9ce65b-1339-4198-ae4d-5697206eba5f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249775 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-service-ca\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249826 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-config\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249884 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kpsv\" (UniqueName: 
\"kubernetes.io/projected/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-kube-api-access-9kpsv\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.249987 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79v2m\" (UniqueName: \"kubernetes.io/projected/c142d29b-ca43-49b7-8055-3175cdf9c45e-kube-api-access-79v2m\") pod \"dns-operator-744455d44c-r62wb\" (UID: \"c142d29b-ca43-49b7-8055-3175cdf9c45e\") " pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250044 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f10add70-0777-45cf-9555-7bda3b6ebeec-metrics-tls\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250073 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d44nl\" (UniqueName: \"kubernetes.io/projected/9d9ce65b-1339-4198-ae4d-5697206eba5f-kube-api-access-d44nl\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250179 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-trusted-ca\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc 
kubenswrapper[4931]: I0130 05:10:05.250224 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250275 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-stats-auth\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250319 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6ps5\" (UniqueName: \"kubernetes.io/projected/21e8738e-651f-4f09-a052-1ff22028e3f3-kube-api-access-k6ps5\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250354 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-ca\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250375 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-client\") pod \"etcd-operator-b45778765-wwbmr\" (UID: 
\"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250397 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-config\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250453 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-default-certificate\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250496 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-serving-cert\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250515 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvpkw\" (UniqueName: \"kubernetes.io/projected/3ac5359b-e653-4824-ad6f-4672970dc0cc-kube-api-access-dvpkw\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250535 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6e653e21-3e72-4867-b39e-f374d752d503-proxy-tls\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250565 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-metrics-certs\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250601 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c142d29b-ca43-49b7-8055-3175cdf9c45e-metrics-tls\") pod \"dns-operator-744455d44c-r62wb\" (UID: \"c142d29b-ca43-49b7-8055-3175cdf9c45e\") " pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250627 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac5359b-e653-4824-ad6f-4672970dc0cc-serving-cert\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.250671 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f10add70-0777-45cf-9555-7bda3b6ebeec-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.251568 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d9ce65b-1339-4198-ae4d-5697206eba5f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.252111 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-trusted-ca\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.252340 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-config\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.252365 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-52zxd"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.252986 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9ce65b-1339-4198-ae4d-5697206eba5f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.253501 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-52zxd" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.254380 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.255571 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.255892 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-serving-cert\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.256803 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.257052 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.258451 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.259701 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.261064 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wwbmr"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.262768 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.264035 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.265623 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lmnvn"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.267402 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7jf2b"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.268897 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.269923 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.271284 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4phnt"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.273532 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.273664 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4phnt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.274178 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4jb99"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.275133 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.276062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.277335 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.278250 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-phq4q"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.279311 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4phnt"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.280454 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.281602 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.282881 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.284019 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-r62wb"] Jan 30 05:10:05 crc 
kubenswrapper[4931]: I0130 05:10:05.285802 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-52zxd"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.286901 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bj2bf"] Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.298223 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.317562 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.324858 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-client\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.338053 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.352220 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f10add70-0777-45cf-9555-7bda3b6ebeec-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.353017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f10add70-0777-45cf-9555-7bda3b6ebeec-trusted-ca\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: 
\"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.353217 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.353539 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e8738e-651f-4f09-a052-1ff22028e3f3-service-ca-bundle\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.353838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5lp4\" (UniqueName: \"kubernetes.io/projected/6e653e21-3e72-4867-b39e-f374d752d503-kube-api-access-f5lp4\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.353961 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8pv9\" (UniqueName: \"kubernetes.io/projected/f10add70-0777-45cf-9555-7bda3b6ebeec-kube-api-access-l8pv9\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.354345 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355156 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e653e21-3e72-4867-b39e-f374d752d503-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79v2m\" (UniqueName: \"kubernetes.io/projected/c142d29b-ca43-49b7-8055-3175cdf9c45e-kube-api-access-79v2m\") pod \"dns-operator-744455d44c-r62wb\" (UID: \"c142d29b-ca43-49b7-8055-3175cdf9c45e\") " pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355309 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f10add70-0777-45cf-9555-7bda3b6ebeec-metrics-tls\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355375 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355414 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-stats-auth\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6ps5\" (UniqueName: \"kubernetes.io/projected/21e8738e-651f-4f09-a052-1ff22028e3f3-kube-api-access-k6ps5\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355581 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-default-certificate\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e653e21-3e72-4867-b39e-f374d752d503-proxy-tls\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355885 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-metrics-certs\") pod 
\"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.355934 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c142d29b-ca43-49b7-8055-3175cdf9c45e-metrics-tls\") pod \"dns-operator-744455d44c-r62wb\" (UID: \"c142d29b-ca43-49b7-8055-3175cdf9c45e\") " pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.357317 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e653e21-3e72-4867-b39e-f374d752d503-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.358189 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.364750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac5359b-e653-4824-ad6f-4672970dc0cc-serving-cert\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.378063 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.398151 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.418440 4931 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.429407 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c142d29b-ca43-49b7-8055-3175cdf9c45e-metrics-tls\") pod \"dns-operator-744455d44c-r62wb\" (UID: \"c142d29b-ca43-49b7-8055-3175cdf9c45e\") " pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.437680 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.457910 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.477881 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.497559 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.501949 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-config\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.517312 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.537405 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.557996 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.561824 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-ca\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.577683 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.588230 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.597346 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.604757 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 
05:10:05.618050 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.621752 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ac5359b-e653-4824-ad6f-4672970dc0cc-etcd-service-ca\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.637467 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.658801 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.678113 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.693390 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f10add70-0777-45cf-9555-7bda3b6ebeec-metrics-tls\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.699010 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.727862 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.735980 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/f10add70-0777-45cf-9555-7bda3b6ebeec-trusted-ca\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.738105 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.758934 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.771855 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-metrics-certs\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.778355 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.798052 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.810909 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-default-certificate\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.818778 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.830294 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21e8738e-651f-4f09-a052-1ff22028e3f3-stats-auth\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.838065 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.844917 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21e8738e-651f-4f09-a052-1ff22028e3f3-service-ca-bundle\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.859308 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.917733 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.937459 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.951672 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e653e21-3e72-4867-b39e-f374d752d503-proxy-tls\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.958706 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.979066 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 05:10:05 crc kubenswrapper[4931]: I0130 05:10:05.998411 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.047135 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q2wf\" (UniqueName: \"kubernetes.io/projected/177d163e-7881-411f-a61b-a00e9c8bc9dc-kube-api-access-5q2wf\") pod \"machine-api-operator-5694c8668f-k9mcd\" (UID: \"177d163e-7881-411f-a61b-a00e9c8bc9dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.067998 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwlrb\" (UniqueName: \"kubernetes.io/projected/e06ad469-0fb9-47d7-90fc-3c74ef8bb833-kube-api-access-mwlrb\") pod \"cluster-samples-operator-665b6dd947-7242g\" (UID: \"e06ad469-0fb9-47d7-90fc-3c74ef8bb833\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.088902 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442g6\" (UniqueName: \"kubernetes.io/projected/cd36df00-a4ac-44ab-bdee-fcf018713f78-kube-api-access-442g6\") pod \"authentication-operator-69f744f599-6wmnm\" (UID: \"cd36df00-a4ac-44ab-bdee-fcf018713f78\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.107001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmhv\" (UniqueName: 
\"kubernetes.io/projected/45ceead9-96b4-4b3c-9fba-1288da84db97-kube-api-access-gvmhv\") pod \"oauth-openshift-558db77b4-ww4ml\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.128307 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2kc7\" (UniqueName: \"kubernetes.io/projected/61a1f22c-baac-4356-9d01-ec2b51700b3a-kube-api-access-b2kc7\") pod \"controller-manager-879f6c89f-fsn4r\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.147264 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrkft\" (UniqueName: \"kubernetes.io/projected/dbab60d9-c5df-4396-8012-94dc987f82c2-kube-api-access-lrkft\") pod \"apiserver-76f77b778f-8ndkb\" (UID: \"dbab60d9-c5df-4396-8012-94dc987f82c2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.159042 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.167913 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmxjn\" (UniqueName: \"kubernetes.io/projected/cf0e8eba-09e8-4d9c-87de-9c57583e7276-kube-api-access-cmxjn\") pod \"console-f9d7485db-ff4lr\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.169071 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.177057 4931 request.go:700] Waited for 1.010108445s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.179731 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.200006 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.200469 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.215104 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.217754 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.240321 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.241252 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.242801 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.262236 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.271448 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.278846 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.327000 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.328925 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vwh\" (UniqueName: \"kubernetes.io/projected/62b9975b-f28e-46de-89a0-bac3d2e7f927-kube-api-access-77vwh\") pod \"machine-approver-56656f9798-sh5fl\" (UID: \"62b9975b-f28e-46de-89a0-bac3d2e7f927\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.358560 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.359847 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22qxm\" (UniqueName: \"kubernetes.io/projected/4fd326f4-63cb-4c1d-bb6c-98118a45f714-kube-api-access-22qxm\") pod \"route-controller-manager-6576b87f9c-5zjn4\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.382050 4931 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.399986 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.418688 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.438112 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.448402 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.448716 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.459171 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.477133 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.492873 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.493729 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g"] Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.501943 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.517879 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.525588 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8ndkb"] Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.538689 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.558967 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.579101 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.599120 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.622584 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.639295 4931 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.658742 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.666386 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ff4lr"] Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.678561 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.698901 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.718139 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.719023 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"] Jan 30 05:10:06 crc kubenswrapper[4931]: W0130 05:10:06.732876 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fd326f4_63cb_4c1d_bb6c_98118a45f714.slice/crio-833fcf0086ce5d914597f0c997c10afab54c09e9f589df3d6d360cb20264d686 WatchSource:0}: Error finding container 833fcf0086ce5d914597f0c997c10afab54c09e9f589df3d6d360cb20264d686: Status 404 returned error can't find the container with id 833fcf0086ce5d914597f0c997c10afab54c09e9f589df3d6d360cb20264d686 Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.737462 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 
05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.758308 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.774092 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ww4ml"] Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.777603 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k9mcd"] Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.778610 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsn4r"] Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.780939 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.791254 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6wmnm"] Jan 30 05:10:06 crc kubenswrapper[4931]: W0130 05:10:06.795161 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61a1f22c_baac_4356_9d01_ec2b51700b3a.slice/crio-e5797b6657e3c9082bc25bca94daca7b60cb46a9c442bf1c2289963ba55e2ade WatchSource:0}: Error finding container e5797b6657e3c9082bc25bca94daca7b60cb46a9c442bf1c2289963ba55e2ade: Status 404 returned error can't find the container with id e5797b6657e3c9082bc25bca94daca7b60cb46a9c442bf1c2289963ba55e2ade Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.799497 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.818299 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.837842 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 30 05:10:06 crc kubenswrapper[4931]: W0130 05:10:06.840040 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ceead9_96b4_4b3c_9fba_1288da84db97.slice/crio-58f7af397c08f51e1fad13d7c31e06e26340ff1e4667e88288913594a1b1daca WatchSource:0}: Error finding container 58f7af397c08f51e1fad13d7c31e06e26340ff1e4667e88288913594a1b1daca: Status 404 returned error can't find the container with id 58f7af397c08f51e1fad13d7c31e06e26340ff1e4667e88288913594a1b1daca Jan 30 05:10:06 crc kubenswrapper[4931]: W0130 05:10:06.840809 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod177d163e_7881_411f_a61b_a00e9c8bc9dc.slice/crio-0c76b00db36e1460a93a4d001a56b2b33fcfd76b0b28d25ce3ce7f0597833b2b WatchSource:0}: Error finding container 0c76b00db36e1460a93a4d001a56b2b33fcfd76b0b28d25ce3ce7f0597833b2b: Status 404 returned error can't find the container with id 0c76b00db36e1460a93a4d001a56b2b33fcfd76b0b28d25ce3ce7f0597833b2b Jan 30 05:10:06 crc kubenswrapper[4931]: W0130 05:10:06.841490 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd36df00_a4ac_44ab_bdee_fcf018713f78.slice/crio-37fe82f073303ad2ec26a00e779c977370715ada853bd13298f25780d264336c WatchSource:0}: Error finding container 37fe82f073303ad2ec26a00e779c977370715ada853bd13298f25780d264336c: Status 404 returned error can't find the container with id 37fe82f073303ad2ec26a00e779c977370715ada853bd13298f25780d264336c Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.859632 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.878544 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.897594 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.919221 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.939148 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.958014 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.977275 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 05:10:06 crc kubenswrapper[4931]: I0130 05:10:06.997267 4931 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.040239 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.041281 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d9ce65b-1339-4198-ae4d-5697206eba5f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:07 crc 
kubenswrapper[4931]: I0130 05:10:07.058786 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.110209 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kpsv\" (UniqueName: \"kubernetes.io/projected/606fa13b-30a4-412e-86eb-9fcb5bc8ebb6-kube-api-access-9kpsv\") pod \"console-operator-58897d9998-7jf2b\" (UID: \"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6\") " pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.115221 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d44nl\" (UniqueName: \"kubernetes.io/projected/9d9ce65b-1339-4198-ae4d-5697206eba5f-kube-api-access-d44nl\") pod \"cluster-image-registry-operator-dc59b4c8b-d7ptq\" (UID: \"9d9ce65b-1339-4198-ae4d-5697206eba5f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.137994 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.142122 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvpkw\" (UniqueName: \"kubernetes.io/projected/3ac5359b-e653-4824-ad6f-4672970dc0cc-kube-api-access-dvpkw\") pod \"etcd-operator-b45778765-wwbmr\" (UID: \"3ac5359b-e653-4824-ad6f-4672970dc0cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.156188 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.157605 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.171569 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.179074 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.196777 4931 request.go:700] Waited for 1.94295257s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.200174 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.216138 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.218990 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.237210 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.261092 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.297196 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f10add70-0777-45cf-9555-7bda3b6ebeec-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.313093 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5lp4\" (UniqueName: \"kubernetes.io/projected/6e653e21-3e72-4867-b39e-f374d752d503-kube-api-access-f5lp4\") pod \"machine-config-controller-84d6567774-c8568\" (UID: \"6e653e21-3e72-4867-b39e-f374d752d503\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.341776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8pv9\" (UniqueName: \"kubernetes.io/projected/f10add70-0777-45cf-9555-7bda3b6ebeec-kube-api-access-l8pv9\") pod \"ingress-operator-5b745b69d9-2sc4j\" (UID: \"f10add70-0777-45cf-9555-7bda3b6ebeec\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.357975 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f56094dd-41e6-41ed-9660-73cc0a3eb1ba-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2d7cc\" (UID: \"f56094dd-41e6-41ed-9660-73cc0a3eb1ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.386933 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79v2m\" (UniqueName: \"kubernetes.io/projected/c142d29b-ca43-49b7-8055-3175cdf9c45e-kube-api-access-79v2m\") pod \"dns-operator-744455d44c-r62wb\" (UID: \"c142d29b-ca43-49b7-8055-3175cdf9c45e\") " pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.399293 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6ps5\" (UniqueName: \"kubernetes.io/projected/21e8738e-651f-4f09-a052-1ff22028e3f3-kube-api-access-k6ps5\") pod \"router-default-5444994796-268mt\" (UID: \"21e8738e-651f-4f09-a052-1ff22028e3f3\") " pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.415603 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" event={"ID":"62b9975b-f28e-46de-89a0-bac3d2e7f927","Type":"ContainerStarted","Data":"586161ab3ede775ac3f91e597f7b0b1a477402d8abf5c30a4e8f49f56e2dd9d8"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.415669 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" event={"ID":"62b9975b-f28e-46de-89a0-bac3d2e7f927","Type":"ContainerStarted","Data":"50ef286c889de6bdaef344297b333f2d83d4717f3d694f829044cd75e6359a43"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.432889 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" event={"ID":"45ceead9-96b4-4b3c-9fba-1288da84db97","Type":"ContainerStarted","Data":"b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.432950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" event={"ID":"45ceead9-96b4-4b3c-9fba-1288da84db97","Type":"ContainerStarted","Data":"58f7af397c08f51e1fad13d7c31e06e26340ff1e4667e88288913594a1b1daca"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.433263 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.437609 4931 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ww4ml container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: connect: connection refused" start-of-body= Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.437676 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" podUID="45ceead9-96b4-4b3c-9fba-1288da84db97" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: connect: connection refused" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.449932 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq"] Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.455594 4931 generic.go:334] "Generic (PLEG): container finished" podID="dbab60d9-c5df-4396-8012-94dc987f82c2" containerID="3f631eef3438b6a10b7ec9737933edad6e5d0320c35f10d8a0eecdffe6bb346f" exitCode=0 Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.455814 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" event={"ID":"dbab60d9-c5df-4396-8012-94dc987f82c2","Type":"ContainerDied","Data":"3f631eef3438b6a10b7ec9737933edad6e5d0320c35f10d8a0eecdffe6bb346f"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.455867 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" event={"ID":"dbab60d9-c5df-4396-8012-94dc987f82c2","Type":"ContainerStarted","Data":"7031bf54e3631eefcdafc9a21435a3301aa9fa45b4b5aa4cce5c515cdfbdcd56"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.460636 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" event={"ID":"e06ad469-0fb9-47d7-90fc-3c74ef8bb833","Type":"ContainerStarted","Data":"a2e9b7dac206f851fa736dfb4c87eb536c9b58782137e02ba4984a609d0e49fb"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.460696 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" event={"ID":"e06ad469-0fb9-47d7-90fc-3c74ef8bb833","Type":"ContainerStarted","Data":"55fe6f4dad892ec73a230daaa6e14e6e05918aea3140a9d1b7451b371834b786"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.467488 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ff4lr" event={"ID":"cf0e8eba-09e8-4d9c-87de-9c57583e7276","Type":"ContainerStarted","Data":"0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.467522 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ff4lr" event={"ID":"cf0e8eba-09e8-4d9c-87de-9c57583e7276","Type":"ContainerStarted","Data":"6ef4e3652e767b58bcd714efc40fa7c13d1316dc132366e3239b8378ad811289"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.472232 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" event={"ID":"61a1f22c-baac-4356-9d01-ec2b51700b3a","Type":"ContainerStarted","Data":"5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.472273 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" event={"ID":"61a1f22c-baac-4356-9d01-ec2b51700b3a","Type":"ContainerStarted","Data":"e5797b6657e3c9082bc25bca94daca7b60cb46a9c442bf1c2289963ba55e2ade"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.472450 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.474467 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" event={"ID":"177d163e-7881-411f-a61b-a00e9c8bc9dc","Type":"ContainerStarted","Data":"1d8da87a29ce5940ad3ab1452a8ec7b489200512c995b66af845eadbd6ebfb32"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.474534 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" event={"ID":"177d163e-7881-411f-a61b-a00e9c8bc9dc","Type":"ContainerStarted","Data":"0c76b00db36e1460a93a4d001a56b2b33fcfd76b0b28d25ce3ce7f0597833b2b"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.476192 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" event={"ID":"cd36df00-a4ac-44ab-bdee-fcf018713f78","Type":"ContainerStarted","Data":"77ea8e35397d40af396e6f38d330aa0147be23d8777295af50f404fccf6ee812"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.476214 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" event={"ID":"cd36df00-a4ac-44ab-bdee-fcf018713f78","Type":"ContainerStarted","Data":"37fe82f073303ad2ec26a00e779c977370715ada853bd13298f25780d264336c"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.477973 4931 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fsn4r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.478009 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" podUID="61a1f22c-baac-4356-9d01-ec2b51700b3a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.478475 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" event={"ID":"4fd326f4-63cb-4c1d-bb6c-98118a45f714","Type":"ContainerStarted","Data":"8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.478530 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" event={"ID":"4fd326f4-63cb-4c1d-bb6c-98118a45f714","Type":"ContainerStarted","Data":"833fcf0086ce5d914597f0c997c10afab54c09e9f589df3d6d360cb20264d686"} Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.480858 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.482111 4931 patch_prober.go:28] 
interesting pod/route-controller-manager-6576b87f9c-5zjn4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.482153 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" podUID="4fd326f4-63cb-4c1d-bb6c-98118a45f714" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.488344 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7jf2b"] Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507730 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32e4a367-9945-4fdb-b5bc-4c8d35512264-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507803 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-certificates\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507821 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjxk\" (UniqueName: 
\"kubernetes.io/projected/8158446c-5883-48ad-86da-77db470d8214-kube-api-access-rcjxk\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507850 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-audit-policies\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507868 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507885 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-trusted-ca\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507904 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlqvm\" (UniqueName: \"kubernetes.io/projected/fc8b1aac-27e5-4f8c-a329-821c231fb7c6-kube-api-access-qlqvm\") pod \"downloads-7954f5f757-tbgzs\" (UID: \"fc8b1aac-27e5-4f8c-a329-821c231fb7c6\") " pod="openshift-console/downloads-7954f5f757-tbgzs" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 
05:10:07.507929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nltg\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-kube-api-access-6nltg\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507949 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-config\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507965 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34eba67-19a6-4d4e-a902-9482b2847199-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.507982 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/25a99ace-2c29-419e-b5de-3f11b024ee43-audit-dir\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508004 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-bound-sa-token\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508038 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907e32f6-7e41-43fb-862c-c6a5f835ff73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508055 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-encryption-config\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508073 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvk4j\" (UniqueName: \"kubernetes.io/projected/907e32f6-7e41-43fb-862c-c6a5f835ff73-kube-api-access-hvk4j\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508189 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-serving-cert\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508223 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8158446c-5883-48ad-86da-77db470d8214-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508241 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dkvt\" (UniqueName: \"kubernetes.io/projected/25a99ace-2c29-419e-b5de-3f11b024ee43-kube-api-access-9dkvt\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f183f0-7d5c-45c9-88a3-df19bd214439-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508275 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-etcd-client\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508290 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a34eba67-19a6-4d4e-a902-9482b2847199-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508304 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-tls\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.508324 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.509641 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sx7q\" (UniqueName: \"kubernetes.io/projected/a34eba67-19a6-4d4e-a902-9482b2847199-kube-api-access-8sx7q\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.509675 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.509692 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.509711 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907e32f6-7e41-43fb-862c-c6a5f835ff73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.512145 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f183f0-7d5c-45c9-88a3-df19bd214439-config\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.512235 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63f183f0-7d5c-45c9-88a3-df19bd214439-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.512336 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.512162 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-r62wb"
Jan 30 05:10:07 crc kubenswrapper[4931]: E0130 05:10:07.512965 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.012942107 +0000 UTC m=+143.382852364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.513079 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8158446c-5883-48ad-86da-77db470d8214-serving-cert\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.513269 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32e4a367-9945-4fdb-b5bc-4c8d35512264-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.522528 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wwbmr"]
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.523917 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.531963 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.538657 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-268mt"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.552626 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568"
Jan 30 05:10:07 crc kubenswrapper[4931]: W0130 05:10:07.609161 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e8738e_651f_4f09_a052_1ff22028e3f3.slice/crio-5451d8c83316bfe9169f10fc93c4221f7918a21b39649b65fa46ec5547e8d5ea WatchSource:0}: Error finding container 5451d8c83316bfe9169f10fc93c4221f7918a21b39649b65fa46ec5547e8d5ea: Status 404 returned error can't find the container with id 5451d8c83316bfe9169f10fc93c4221f7918a21b39649b65fa46ec5547e8d5ea
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.616080 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:10:07 crc kubenswrapper[4931]: E0130 05:10:07.616792 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.116732704 +0000 UTC m=+143.486642961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617154 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617237 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617289 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-config\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617336 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907e32f6-7e41-43fb-862c-c6a5f835ff73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617360 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-proxy-tls\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617444 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-certs\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617475 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f183f0-7d5c-45c9-88a3-df19bd214439-config\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617514 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63f183f0-7d5c-45c9-88a3-df19bd214439-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gbb\" (UniqueName: \"kubernetes.io/projected/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-kube-api-access-f7gbb\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617616 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2b7b\" (UniqueName: \"kubernetes.io/projected/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-kube-api-access-l2b7b\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617731 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8158446c-5883-48ad-86da-77db470d8214-serving-cert\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0071e157-d4b7-40ac-8a50-35f9b7aa961d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617950 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbwmm\" (UniqueName: \"kubernetes.io/projected/8ec2553d-d0b3-4b15-a42c-73c1c25ea70f-kube-api-access-cbwmm\") pod \"migrator-59844c95c7-gb6js\" (UID: \"8ec2553d-d0b3-4b15-a42c-73c1c25ea70f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.617990 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-serving-cert\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618014 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ksg\" (UniqueName: \"kubernetes.io/projected/aa1d7187-eeb1-4145-8ba0-dd1e43023003-kube-api-access-x5ksg\") pod \"package-server-manager-789f6589d5-mg25x\" (UID: \"aa1d7187-eeb1-4145-8ba0-dd1e43023003\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618050 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmq7k\" (UniqueName: \"kubernetes.io/projected/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-kube-api-access-xmq7k\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618091 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32e4a367-9945-4fdb-b5bc-4c8d35512264-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618149 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-mountpoint-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618173 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618200 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618235 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48aa89b7-0ab1-432b-a693-2d56358c1d83-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618362 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855th\" (UniqueName: \"kubernetes.io/projected/1a8f99a6-f163-4720-8eb4-bc8607753d79-kube-api-access-855th\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618396 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-signing-cabundle\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618678 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32e4a367-9945-4fdb-b5bc-4c8d35512264-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.618714 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clphc\" (UniqueName: \"kubernetes.io/projected/6db474c1-8489-43b1-bb9e-0961f9dc1dc4-kube-api-access-clphc\") pod \"multus-admission-controller-857f4d67dd-lmnvn\" (UID: \"6db474c1-8489-43b1-bb9e-0961f9dc1dc4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.619751 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907e32f6-7e41-43fb-862c-c6a5f835ff73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.619939 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32e4a367-9945-4fdb-b5bc-4c8d35512264-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.621729 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f183f0-7d5c-45c9-88a3-df19bd214439-config\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:07 crc kubenswrapper[4931]: E0130 05:10:07.622646 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.122618739 +0000 UTC m=+143.492528996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.623572 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-signing-key\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.623694 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjxk\" (UniqueName: \"kubernetes.io/projected/8158446c-5883-48ad-86da-77db470d8214-kube-api-access-rcjxk\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.624488 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-certificates\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.624725 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.624790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6db474c1-8489-43b1-bb9e-0961f9dc1dc4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lmnvn\" (UID: \"6db474c1-8489-43b1-bb9e-0961f9dc1dc4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.625145 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-audit-policies\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.626965 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7krc\" (UniqueName: \"kubernetes.io/projected/e521b474-9f29-4841-a365-ed1589358607-kube-api-access-m7krc\") pod \"control-plane-machine-set-operator-78cbb6b69f-w2zzb\" (UID: \"e521b474-9f29-4841-a365-ed1589358607\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.628640 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qt2\" (UniqueName: \"kubernetes.io/projected/0071e157-d4b7-40ac-8a50-35f9b7aa961d-kube-api-access-56qt2\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.629110 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-trusted-ca\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.629146 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlqvm\" (UniqueName: \"kubernetes.io/projected/fc8b1aac-27e5-4f8c-a329-821c231fb7c6-kube-api-access-qlqvm\") pod \"downloads-7954f5f757-tbgzs\" (UID: \"fc8b1aac-27e5-4f8c-a329-821c231fb7c6\") " pod="openshift-console/downloads-7954f5f757-tbgzs"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.629481 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.630771 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-certificates\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.631530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-audit-policies\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.631775 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2g64\" (UniqueName: \"kubernetes.io/projected/107d8fb1-31b1-4bec-8d55-a27e312609b1-kube-api-access-r2g64\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.631813 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-config\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.631877 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-node-bootstrap-token\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.632021 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nltg\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-kube-api-access-6nltg\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.632062 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34eba67-19a6-4d4e-a902-9482b2847199-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.632127 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/25a99ace-2c29-419e-b5de-3f11b024ee43-audit-dir\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.632598 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-trusted-ca\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.633206 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-config\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.634082 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32e4a367-9945-4fdb-b5bc-4c8d35512264-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.635112 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8158446c-5883-48ad-86da-77db470d8214-serving-cert\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636166 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-bound-sa-token\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636331 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a8f99a6-f163-4720-8eb4-bc8607753d79-secret-volume\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636364 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-plugins-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636386 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-tmpfs\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 
05:10:07.636577 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907e32f6-7e41-43fb-862c-c6a5f835ff73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636608 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msk64\" (UniqueName: \"kubernetes.io/projected/eb3c187d-243f-457f-b419-e02a6898fd48-kube-api-access-msk64\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636633 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-socket-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636677 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-encryption-config\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636705 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvk4j\" (UniqueName: \"kubernetes.io/projected/907e32f6-7e41-43fb-862c-c6a5f835ff73-kube-api-access-hvk4j\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: 
\"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636725 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-images\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636747 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa1d7187-eeb1-4145-8ba0-dd1e43023003-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mg25x\" (UID: \"aa1d7187-eeb1-4145-8ba0-dd1e43023003\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.636921 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr6c9\" (UniqueName: \"kubernetes.io/projected/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-kube-api-access-xr6c9\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.637063 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-apiservice-cert\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.637383 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34eba67-19a6-4d4e-a902-9482b2847199-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.637479 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8ts\" (UniqueName: \"kubernetes.io/projected/48aa89b7-0ab1-432b-a693-2d56358c1d83-kube-api-access-zk8ts\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.637679 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/25a99ace-2c29-419e-b5de-3f11b024ee43-audit-dir\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.638347 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.644546 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r28cq\" (UniqueName: \"kubernetes.io/projected/bc314d0c-da50-4607-93e1-5bece9c3b2b1-kube-api-access-r28cq\") pod \"marketplace-operator-79b997595-phq4q\" (UID: 
\"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.645577 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-encryption-config\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.645768 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907e32f6-7e41-43fb-862c-c6a5f835ff73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.645819 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cl26\" (UniqueName: \"kubernetes.io/projected/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-kube-api-access-6cl26\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.645993 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.646027 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-registration-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.646612 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a8f99a6-f163-4720-8eb4-bc8607753d79-config-volume\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.646655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-serving-cert\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.646678 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldhj\" (UniqueName: \"kubernetes.io/projected/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-kube-api-access-tldhj\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.647132 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b53f53c-0afe-4574-aebc-64c6d81f10d9-cert\") pod \"ingress-canary-52zxd\" (UID: \"6b53f53c-0afe-4574-aebc-64c6d81f10d9\") " pod="openshift-ingress-canary/ingress-canary-52zxd" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 
05:10:07.647261 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8158446c-5883-48ad-86da-77db470d8214-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.647300 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-webhook-cert\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.651356 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dkvt\" (UniqueName: \"kubernetes.io/projected/25a99ace-2c29-419e-b5de-3f11b024ee43-kube-api-access-9dkvt\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.651457 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f183f0-7d5c-45c9-88a3-df19bd214439-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.651481 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-etcd-client\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: 
\"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.651517 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a34eba67-19a6-4d4e-a902-9482b2847199-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653229 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-tls\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653462 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb3c187d-243f-457f-b419-e02a6898fd48-srv-cert\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653518 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-config-volume\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653550 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj727\" (UniqueName: 
\"kubernetes.io/projected/6b53f53c-0afe-4574-aebc-64c6d81f10d9-kube-api-access-pj727\") pod \"ingress-canary-52zxd\" (UID: \"6b53f53c-0afe-4574-aebc-64c6d81f10d9\") " pod="openshift-ingress-canary/ingress-canary-52zxd" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653569 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-metrics-tls\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653620 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0071e157-d4b7-40ac-8a50-35f9b7aa961d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653684 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48aa89b7-0ab1-432b-a693-2d56358c1d83-srv-cert\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653714 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653776 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb3c187d-243f-457f-b419-e02a6898fd48-profile-collector-cert\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653805 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e521b474-9f29-4841-a365-ed1589358607-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w2zzb\" (UID: \"e521b474-9f29-4841-a365-ed1589358607\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653832 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-csi-data-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.653878 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sx7q\" (UniqueName: \"kubernetes.io/projected/a34eba67-19a6-4d4e-a902-9482b2847199-kube-api-access-8sx7q\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.654375 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/25a99ace-2c29-419e-b5de-3f11b024ee43-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.657161 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8158446c-5883-48ad-86da-77db470d8214-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.660684 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-serving-cert\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.661550 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a34eba67-19a6-4d4e-a902-9482b2847199-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.662284 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/25a99ace-2c29-419e-b5de-3f11b024ee43-etcd-client\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.662490 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f183f0-7d5c-45c9-88a3-df19bd214439-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.664126 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6fa9de7-6e76-472c-99c8-51a6c52eb6ae-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l98zf\" (UID: \"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.668101 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-tls\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.683894 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63f183f0-7d5c-45c9-88a3-df19bd214439-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zfq7h\" (UID: \"63f183f0-7d5c-45c9-88a3-df19bd214439\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.706583 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjxk\" (UniqueName: \"kubernetes.io/projected/8158446c-5883-48ad-86da-77db470d8214-kube-api-access-rcjxk\") pod \"openshift-config-operator-7777fb866f-wwdht\" (UID: \"8158446c-5883-48ad-86da-77db470d8214\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.747626 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlqvm\" (UniqueName: \"kubernetes.io/projected/fc8b1aac-27e5-4f8c-a329-821c231fb7c6-kube-api-access-qlqvm\") pod \"downloads-7954f5f757-tbgzs\" (UID: \"fc8b1aac-27e5-4f8c-a329-821c231fb7c6\") " pod="openshift-console/downloads-7954f5f757-tbgzs" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.755760 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756038 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clphc\" (UniqueName: \"kubernetes.io/projected/6db474c1-8489-43b1-bb9e-0961f9dc1dc4-kube-api-access-clphc\") pod \"multus-admission-controller-857f4d67dd-lmnvn\" (UID: \"6db474c1-8489-43b1-bb9e-0961f9dc1dc4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756063 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855th\" (UniqueName: \"kubernetes.io/projected/1a8f99a6-f163-4720-8eb4-bc8607753d79-kube-api-access-855th\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756082 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-signing-cabundle\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756150 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-signing-key\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756172 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6db474c1-8489-43b1-bb9e-0961f9dc1dc4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lmnvn\" (UID: \"6db474c1-8489-43b1-bb9e-0961f9dc1dc4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756198 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7krc\" (UniqueName: \"kubernetes.io/projected/e521b474-9f29-4841-a365-ed1589358607-kube-api-access-m7krc\") pod \"control-plane-machine-set-operator-78cbb6b69f-w2zzb\" (UID: \"e521b474-9f29-4841-a365-ed1589358607\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756217 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56qt2\" (UniqueName: \"kubernetes.io/projected/0071e157-d4b7-40ac-8a50-35f9b7aa961d-kube-api-access-56qt2\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42" Jan 30 
05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756240 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2g64\" (UniqueName: \"kubernetes.io/projected/107d8fb1-31b1-4bec-8d55-a27e312609b1-kube-api-access-r2g64\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756264 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-node-bootstrap-token\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756290 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a8f99a6-f163-4720-8eb4-bc8607753d79-secret-volume\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756307 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-plugins-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756323 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-tmpfs\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756351 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msk64\" (UniqueName: \"kubernetes.io/projected/eb3c187d-243f-457f-b419-e02a6898fd48-kube-api-access-msk64\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756381 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-socket-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756684 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-images\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756713 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa1d7187-eeb1-4145-8ba0-dd1e43023003-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mg25x\" (UID: \"aa1d7187-eeb1-4145-8ba0-dd1e43023003\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756752 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6c9\" (UniqueName: \"kubernetes.io/projected/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-kube-api-access-xr6c9\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756788 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-apiservice-cert\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756816 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8ts\" (UniqueName: \"kubernetes.io/projected/48aa89b7-0ab1-432b-a693-2d56358c1d83-kube-api-access-zk8ts\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756839 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r28cq\" (UniqueName: \"kubernetes.io/projected/bc314d0c-da50-4607-93e1-5bece9c3b2b1-kube-api-access-r28cq\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756871 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cl26\" (UniqueName: \"kubernetes.io/projected/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-kube-api-access-6cl26\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756913 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756937 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-registration-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.756992 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldhj\" (UniqueName: \"kubernetes.io/projected/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-kube-api-access-tldhj\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a8f99a6-f163-4720-8eb4-bc8607753d79-config-volume\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757027 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b53f53c-0afe-4574-aebc-64c6d81f10d9-cert\") pod \"ingress-canary-52zxd\" (UID: \"6b53f53c-0afe-4574-aebc-64c6d81f10d9\") " pod="openshift-ingress-canary/ingress-canary-52zxd"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757047 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-webhook-cert\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757087 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb3c187d-243f-457f-b419-e02a6898fd48-srv-cert\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757101 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-config-volume\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757118 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj727\" (UniqueName: \"kubernetes.io/projected/6b53f53c-0afe-4574-aebc-64c6d81f10d9-kube-api-access-pj727\") pod \"ingress-canary-52zxd\" (UID: \"6b53f53c-0afe-4574-aebc-64c6d81f10d9\") " pod="openshift-ingress-canary/ingress-canary-52zxd"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757135 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-metrics-tls\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757154 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0071e157-d4b7-40ac-8a50-35f9b7aa961d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48aa89b7-0ab1-432b-a693-2d56358c1d83-srv-cert\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757194 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb3c187d-243f-457f-b419-e02a6898fd48-profile-collector-cert\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757212 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e521b474-9f29-4841-a365-ed1589358607-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w2zzb\" (UID: \"e521b474-9f29-4841-a365-ed1589358607\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757231 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-csi-data-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757264 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-config\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757289 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-certs\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757305 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-proxy-tls\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757330 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gbb\" (UniqueName: \"kubernetes.io/projected/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-kube-api-access-f7gbb\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757371 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2b7b\" (UniqueName: \"kubernetes.io/projected/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-kube-api-access-l2b7b\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757398 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0071e157-d4b7-40ac-8a50-35f9b7aa961d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757441 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbwmm\" (UniqueName: \"kubernetes.io/projected/8ec2553d-d0b3-4b15-a42c-73c1c25ea70f-kube-api-access-cbwmm\") pod \"migrator-59844c95c7-gb6js\" (UID: \"8ec2553d-d0b3-4b15-a42c-73c1c25ea70f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757457 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-serving-cert\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757473 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ksg\" (UniqueName: \"kubernetes.io/projected/aa1d7187-eeb1-4145-8ba0-dd1e43023003-kube-api-access-x5ksg\") pod \"package-server-manager-789f6589d5-mg25x\" (UID: \"aa1d7187-eeb1-4145-8ba0-dd1e43023003\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmq7k\" (UniqueName: \"kubernetes.io/projected/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-kube-api-access-xmq7k\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757512 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-mountpoint-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757551 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.757568 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48aa89b7-0ab1-432b-a693-2d56358c1d83-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.758965 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-tmpfs\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.759521 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-config\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: E0130 05:10:07.759951 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.259911687 +0000 UTC m=+143.629822114 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.760857 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-signing-cabundle\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.761275 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-plugins-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.763182 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.763256 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-registration-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.763865 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-apiservice-cert\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.763896 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6db474c1-8489-43b1-bb9e-0961f9dc1dc4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lmnvn\" (UID: \"6db474c1-8489-43b1-bb9e-0961f9dc1dc4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.764046 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-socket-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.764401 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-images\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.764876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-config-volume\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.765284 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0071e157-d4b7-40ac-8a50-35f9b7aa961d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.765297 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.766172 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-mountpoint-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.766971 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a8f99a6-f163-4720-8eb4-bc8607753d79-config-volume\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.767352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa1d7187-eeb1-4145-8ba0-dd1e43023003-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mg25x\" (UID: \"aa1d7187-eeb1-4145-8ba0-dd1e43023003\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.768137 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.768311 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/107d8fb1-31b1-4bec-8d55-a27e312609b1-csi-data-dir\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.771742 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-signing-key\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.772166 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a8f99a6-f163-4720-8eb4-bc8607753d79-secret-volume\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.772727 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-serving-cert\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.773769 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb3c187d-243f-457f-b419-e02a6898fd48-srv-cert\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.774834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48aa89b7-0ab1-432b-a693-2d56358c1d83-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.775113 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-node-bootstrap-token\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.774359 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b53f53c-0afe-4574-aebc-64c6d81f10d9-cert\") pod \"ingress-canary-52zxd\" (UID: \"6b53f53c-0afe-4574-aebc-64c6d81f10d9\") " pod="openshift-ingress-canary/ingress-canary-52zxd"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.776483 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nltg\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-kube-api-access-6nltg\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.778946 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvk4j\" (UniqueName: \"kubernetes.io/projected/907e32f6-7e41-43fb-862c-c6a5f835ff73-kube-api-access-hvk4j\") pod \"openshift-apiserver-operator-796bbdcf4f-dzzxq\" (UID: \"907e32f6-7e41-43fb-862c-c6a5f835ff73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.779603 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb3c187d-243f-457f-b419-e02a6898fd48-profile-collector-cert\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.779912 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.782776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0071e157-d4b7-40ac-8a50-35f9b7aa961d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.784616 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-certs\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.799005 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.800206 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-bound-sa-token\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.800856 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-metrics-tls\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.802200 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48aa89b7-0ab1-432b-a693-2d56358c1d83-srv-cert\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.810283 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e521b474-9f29-4841-a365-ed1589358607-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w2zzb\" (UID: \"e521b474-9f29-4841-a365-ed1589358607\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.810294 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-webhook-cert\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.810292 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.814293 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-proxy-tls\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.822749 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-r62wb"]
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.823210 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sx7q\" (UniqueName: \"kubernetes.io/projected/a34eba67-19a6-4d4e-a902-9482b2847199-kube-api-access-8sx7q\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqjcv\" (UID: \"a34eba67-19a6-4d4e-a902-9482b2847199\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.839612 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dkvt\" (UniqueName: \"kubernetes.io/projected/25a99ace-2c29-419e-b5de-3f11b024ee43-kube-api-access-9dkvt\") pod \"apiserver-7bbb656c7d-pspt5\" (UID: \"25a99ace-2c29-419e-b5de-3f11b024ee43\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.858723 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7"
Jan 30 05:10:07 crc kubenswrapper[4931]: E0130 05:10:07.859501 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.359482194 +0000 UTC m=+143.729392451 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:10:07 crc kubenswrapper[4931]: W0130 05:10:07.859589 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc142d29b_ca43_49b7_8055_3175cdf9c45e.slice/crio-1799b4060a05f1ed7b73abfd6c9a938d2a59baf01655db5a69f70b4b95f0ace7 WatchSource:0}: Error finding container 1799b4060a05f1ed7b73abfd6c9a938d2a59baf01655db5a69f70b4b95f0ace7: Status 404 returned error can't find the container with id 1799b4060a05f1ed7b73abfd6c9a938d2a59baf01655db5a69f70b4b95f0ace7
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.868088 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc"]
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.881130 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldhj\" (UniqueName: \"kubernetes.io/projected/0fbcd41f-dfc6-4581-9f07-74e0efea3f1a-kube-api-access-tldhj\") pod \"machine-config-operator-74547568cd-5c95n\" (UID: \"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: W0130 05:10:07.882017 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56094dd_41e6_41ed_9660_73cc0a3eb1ba.slice/crio-7d2922e3069bcc03e72eeb30ba7364f02c8649416551786a4d4e8647ed60e55d WatchSource:0}: Error finding container 7d2922e3069bcc03e72eeb30ba7364f02c8649416551786a4d4e8647ed60e55d: Status 404 returned error can't find the container with id 7d2922e3069bcc03e72eeb30ba7364f02c8649416551786a4d4e8647ed60e55d
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.885639 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.897646 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7krc\" (UniqueName: \"kubernetes.io/projected/e521b474-9f29-4841-a365-ed1589358607-kube-api-access-m7krc\") pod \"control-plane-machine-set-operator-78cbb6b69f-w2zzb\" (UID: \"e521b474-9f29-4841-a365-ed1589358607\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.915288 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qt2\" (UniqueName: \"kubernetes.io/projected/0071e157-d4b7-40ac-8a50-35f9b7aa961d-kube-api-access-56qt2\") pod \"kube-storage-version-migrator-operator-b67b599dd-84v42\" (UID: \"0071e157-d4b7-40ac-8a50-35f9b7aa961d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.956197 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2g64\" (UniqueName: \"kubernetes.io/projected/107d8fb1-31b1-4bec-8d55-a27e312609b1-kube-api-access-r2g64\") pod \"csi-hostpathplugin-bj2bf\" (UID: \"107d8fb1-31b1-4bec-8d55-a27e312609b1\") " pod="hostpath-provisioner/csi-hostpathplugin-bj2bf"
Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.960237 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:07 crc kubenswrapper[4931]: E0130 05:10:07.960786 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.460768925 +0000 UTC m=+143.830679182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.971283 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855th\" (UniqueName: \"kubernetes.io/projected/1a8f99a6-f163-4720-8eb4-bc8607753d79-kube-api-access-855th\") pod \"collect-profiles-29495820-5cp8g\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.986726 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.991176 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clphc\" (UniqueName: \"kubernetes.io/projected/6db474c1-8489-43b1-bb9e-0961f9dc1dc4-kube-api-access-clphc\") pod \"multus-admission-controller-857f4d67dd-lmnvn\" (UID: \"6db474c1-8489-43b1-bb9e-0961f9dc1dc4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" Jan 30 05:10:07 crc kubenswrapper[4931]: I0130 05:10:07.992936 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j"] Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.007810 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.008061 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8ts\" (UniqueName: \"kubernetes.io/projected/48aa89b7-0ab1-432b-a693-2d56358c1d83-kube-api-access-zk8ts\") pod \"olm-operator-6b444d44fb-n4fvr\" (UID: \"48aa89b7-0ab1-432b-a693-2d56358c1d83\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.016870 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tbgzs" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.020742 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.029366 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r28cq\" (UniqueName: \"kubernetes.io/projected/bc314d0c-da50-4607-93e1-5bece9c3b2b1-kube-api-access-r28cq\") pod \"marketplace-operator-79b997595-phq4q\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.030818 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c8568"] Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.038786 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cl26\" (UniqueName: \"kubernetes.io/projected/63899a5c-b7ca-4eca-8418-c6b9bc1f774b-kube-api-access-6cl26\") pod \"packageserver-d55dfcdfc-gx6j5\" (UID: \"63899a5c-b7ca-4eca-8418-c6b9bc1f774b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.039062 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.056001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6c9\" (UniqueName: \"kubernetes.io/projected/fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce-kube-api-access-xr6c9\") pod \"dns-default-4phnt\" (UID: \"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce\") " pod="openshift-dns/dns-default-4phnt" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.071225 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.071885 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.571863125 +0000 UTC m=+143.941773382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.082754 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msk64\" (UniqueName: \"kubernetes.io/projected/eb3c187d-243f-457f-b419-e02a6898fd48-kube-api-access-msk64\") pod \"catalog-operator-68c6474976-rxvzv\" (UID: \"eb3c187d-243f-457f-b419-e02a6898fd48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.094309 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj727\" (UniqueName: \"kubernetes.io/projected/6b53f53c-0afe-4574-aebc-64c6d81f10d9-kube-api-access-pj727\") pod \"ingress-canary-52zxd\" (UID: \"6b53f53c-0afe-4574-aebc-64c6d81f10d9\") " pod="openshift-ingress-canary/ingress-canary-52zxd" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.119977 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbwmm\" (UniqueName: \"kubernetes.io/projected/8ec2553d-d0b3-4b15-a42c-73c1c25ea70f-kube-api-access-cbwmm\") pod \"migrator-59844c95c7-gb6js\" (UID: \"8ec2553d-d0b3-4b15-a42c-73c1c25ea70f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js" Jan 30 05:10:08 crc kubenswrapper[4931]: W0130 05:10:08.139601 4931 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf10add70_0777_45cf_9555_7bda3b6ebeec.slice/crio-115185d2e989f25f8a64ef9857f8ea6b8d666b3ac9799afa22f6a808267acdca WatchSource:0}: Error finding container 115185d2e989f25f8a64ef9857f8ea6b8d666b3ac9799afa22f6a808267acdca: Status 404 returned error can't find the container with id 115185d2e989f25f8a64ef9857f8ea6b8d666b3ac9799afa22f6a808267acdca Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.147414 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmq7k\" (UniqueName: \"kubernetes.io/projected/d11d03bb-be3f-43b0-a59b-5d9fde1c9717-kube-api-access-xmq7k\") pod \"service-ca-operator-777779d784-6pfwg\" (UID: \"d11d03bb-be3f-43b0-a59b-5d9fde1c9717\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.147684 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.158055 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2b7b\" (UniqueName: \"kubernetes.io/projected/b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e-kube-api-access-l2b7b\") pod \"machine-config-server-scnkp\" (UID: \"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e\") " pod="openshift-machine-config-operator/machine-config-server-scnkp" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.162821 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" Jan 30 05:10:08 crc kubenswrapper[4931]: W0130 05:10:08.168710 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e653e21_3e72_4867_b39e_f374d752d503.slice/crio-f46822f967f3b67713b2f25757d023699ad723ad5288c2d15d844d908c545c94 WatchSource:0}: Error finding container f46822f967f3b67713b2f25757d023699ad723ad5288c2d15d844d908c545c94: Status 404 returned error can't find the container with id f46822f967f3b67713b2f25757d023699ad723ad5288c2d15d844d908c545c94 Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.175795 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.176720 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.177044 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.677027279 +0000 UTC m=+144.046937526 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.182280 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.197570 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ksg\" (UniqueName: \"kubernetes.io/projected/aa1d7187-eeb1-4145-8ba0-dd1e43023003-kube-api-access-x5ksg\") pod \"package-server-manager-789f6589d5-mg25x\" (UID: \"aa1d7187-eeb1-4145-8ba0-dd1e43023003\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.198192 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gbb\" (UniqueName: \"kubernetes.io/projected/f61d7ea6-37e5-47de-89e8-8a0dc1b895f9-kube-api-access-f7gbb\") pod \"service-ca-9c57cc56f-4jb99\" (UID: \"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.206651 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.219023 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.223149 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.232809 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-scnkp" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.234166 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf"] Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.239590 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.257589 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.258076 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.265805 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.272348 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-52zxd" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.276732 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4phnt" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.278270 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.278915 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.778873295 +0000 UTC m=+144.148783552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.382898 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.383785 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.883760452 +0000 UTC m=+144.253670709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.443073 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bj2bf"] Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.443117 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h"] Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.485669 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.485996 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:08.985983828 +0000 UTC m=+144.355894085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.495055 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.508918 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wwdht"] Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.516951 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n"] Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.540815 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" event={"ID":"c142d29b-ca43-49b7-8055-3175cdf9c45e","Type":"ContainerStarted","Data":"1799b4060a05f1ed7b73abfd6c9a938d2a59baf01655db5a69f70b4b95f0ace7"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.582292 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" event={"ID":"62b9975b-f28e-46de-89a0-bac3d2e7f927","Type":"ContainerStarted","Data":"b06f7aef078eef4b63f6c8683cb94ac06a872c79a767dccca059e02ae4e4d04f"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.587005 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.587229 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.087193858 +0000 UTC m=+144.457104125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.587546 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.588086 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.088073571 +0000 UTC m=+144.457983828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.589348 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf" event={"ID":"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae","Type":"ContainerStarted","Data":"79c070cb5235142e9b262f19dbb8e16d5dcb7614ebc5d2dfdda6d8f430ca0ee1"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.591020 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" event={"ID":"f56094dd-41e6-41ed-9660-73cc0a3eb1ba","Type":"ContainerStarted","Data":"7d2922e3069bcc03e72eeb30ba7364f02c8649416551786a4d4e8647ed60e55d"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.608595 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7jf2b" event={"ID":"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6","Type":"ContainerStarted","Data":"f627df2c5d97d7add678aae4bf30858e359e0469f937f2fabb4fa636467d2356"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.609118 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.609130 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7jf2b" 
event={"ID":"606fa13b-30a4-412e-86eb-9fcb5bc8ebb6","Type":"ContainerStarted","Data":"e4f5cd5ea507d6e0d8275941863ff61ec41db09214e1b868e8f75c038346fd6d"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.616765 4931 patch_prober.go:28] interesting pod/console-operator-58897d9998-7jf2b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.616822 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7jf2b" podUID="606fa13b-30a4-412e-86eb-9fcb5bc8ebb6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.633226 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" event={"ID":"3ac5359b-e653-4824-ad6f-4672970dc0cc","Type":"ContainerStarted","Data":"dd1a30db2d1332e33de6c4cb89d7a7c551f550f018fa31b21c5bdd572f28147d"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.633290 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" event={"ID":"3ac5359b-e653-4824-ad6f-4672970dc0cc","Type":"ContainerStarted","Data":"69234c254b9137bcdf8d0c9f00100926cca05090829b029106c26c2a7c2a25ed"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.639655 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" event={"ID":"6e653e21-3e72-4867-b39e-f374d752d503","Type":"ContainerStarted","Data":"f46822f967f3b67713b2f25757d023699ad723ad5288c2d15d844d908c545c94"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.644071 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" event={"ID":"e06ad469-0fb9-47d7-90fc-3c74ef8bb833","Type":"ContainerStarted","Data":"ef71aa85d04aec87e6ca79010343f337197cd9e8ae6a72d2abb1163b3b3464ee"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.646113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" event={"ID":"9d9ce65b-1339-4198-ae4d-5697206eba5f","Type":"ContainerStarted","Data":"e0fdeaa06d83a5588d866d9e6750f39b2a25dbaddb13a9c0653d6a6f9ee971d5"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.646140 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" event={"ID":"9d9ce65b-1339-4198-ae4d-5697206eba5f","Type":"ContainerStarted","Data":"9961ae67bbba6ffb4a389a9a98f5f5e3812deeb1bf5bd9c07bd38ed20f591212"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.659500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" event={"ID":"dbab60d9-c5df-4396-8012-94dc987f82c2","Type":"ContainerStarted","Data":"aa8e44e00db75d9aed75b6a5cf74cebb3f9c3c221b78d34bbcba3d9d8a15e25f"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.688926 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.689700 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 05:10:09.189681271 +0000 UTC m=+144.559591518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.690061 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.694062 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.194045316 +0000 UTC m=+144.563955573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.711499 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" event={"ID":"177d163e-7881-411f-a61b-a00e9c8bc9dc","Type":"ContainerStarted","Data":"da16aa93f1a7222f8c24df0184a32a7f6c88bc7e60e55df350f1794bb668ab0e"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.747717 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" event={"ID":"f10add70-0777-45cf-9555-7bda3b6ebeec","Type":"ContainerStarted","Data":"115185d2e989f25f8a64ef9857f8ea6b8d666b3ac9799afa22f6a808267acdca"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.761715 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-268mt" event={"ID":"21e8738e-651f-4f09-a052-1ff22028e3f3","Type":"ContainerStarted","Data":"bab7ff071b13604057a32853a504ba260840750838a51dfecb75902fa27e4a3d"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.761754 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-268mt" event={"ID":"21e8738e-651f-4f09-a052-1ff22028e3f3","Type":"ContainerStarted","Data":"5451d8c83316bfe9169f10fc93c4221f7918a21b39649b65fa46ec5547e8d5ea"} Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.778509 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.778578 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.795784 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.295754839 +0000 UTC m=+144.665665096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.831973 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.832533 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:08 
crc kubenswrapper[4931]: E0130 05:10:08.834527 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.334507227 +0000 UTC m=+144.704417484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.940690 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:08 crc kubenswrapper[4931]: I0130 05:10:08.948025 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv"] Jan 30 05:10:08 crc kubenswrapper[4931]: E0130 05:10:08.949192 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.448957915 +0000 UTC m=+144.818868172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.040410 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.048618 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.049303 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.549277421 +0000 UTC m=+144.919187678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.095582 4931 csr.go:261] certificate signing request csr-f2dd6 is approved, waiting to be issued Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.104243 4931 csr.go:257] certificate signing request csr-f2dd6 is issued Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.150341 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.151010 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.650984674 +0000 UTC m=+145.020894931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.169657 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7242g" podStartSLOduration=122.169635404 podStartE2EDuration="2m2.169635404s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:09.133752641 +0000 UTC m=+144.503662898" watchObservedRunningTime="2026-01-30 05:10:09.169635404 +0000 UTC m=+144.539545661" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.240303 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" podStartSLOduration=122.240288411 podStartE2EDuration="2m2.240288411s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:09.240005644 +0000 UTC m=+144.609915901" watchObservedRunningTime="2026-01-30 05:10:09.240288411 +0000 UTC m=+144.610198668" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.258906 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.259250 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.759234979 +0000 UTC m=+145.129145236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.364241 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.364929 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.864891536 +0000 UTC m=+145.234801793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.466274 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.466877 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:09.966855025 +0000 UTC m=+145.336765282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.525951 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-k9mcd" podStartSLOduration=122.525930438 podStartE2EDuration="2m2.525930438s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:09.465159901 +0000 UTC m=+144.835070158" watchObservedRunningTime="2026-01-30 05:10:09.525930438 +0000 UTC m=+144.895840695" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.527148 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-wwbmr" podStartSLOduration=122.5271423 podStartE2EDuration="2m2.5271423s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:09.495829517 +0000 UTC m=+144.865739774" watchObservedRunningTime="2026-01-30 05:10:09.5271423 +0000 UTC m=+144.897052557" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.545139 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:09 crc kubenswrapper[4931]: [-]has-synced 
failed: reason withheld Jan 30 05:10:09 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:09 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.556471 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.580004 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.583132 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.583606 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.083589523 +0000 UTC m=+145.453499780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.611224 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tbgzs"] Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.664883 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5"] Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.685323 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.685776 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.185757988 +0000 UTC m=+145.555668245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.727240 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7ptq" podStartSLOduration=122.727224168 podStartE2EDuration="2m2.727224168s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:09.670027545 +0000 UTC m=+145.039937802" watchObservedRunningTime="2026-01-30 05:10:09.727224168 +0000 UTC m=+145.097134425" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.761046 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-phq4q"] Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.788586 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.789306 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 05:10:10.289222167 +0000 UTC m=+145.659132424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.893632 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" event={"ID":"dbab60d9-c5df-4396-8012-94dc987f82c2","Type":"ContainerStarted","Data":"820960a390e9f5b8227dfef695af181ec6a4c3f54f9101efd319215772d57fb3"} Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.893774 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:09 crc kubenswrapper[4931]: E0130 05:10:09.902211 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.402187255 +0000 UTC m=+145.772097512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.928755 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" event={"ID":"107d8fb1-31b1-4bec-8d55-a27e312609b1","Type":"ContainerStarted","Data":"716f19dbae5782cb4bb4e2a3002de9af3be27e8955667fd2b4ee2a8d8c79f6a8"} Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.944579 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" podStartSLOduration=122.944543398 podStartE2EDuration="2m2.944543398s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:09.853160197 +0000 UTC m=+145.223070454" watchObservedRunningTime="2026-01-30 05:10:09.944543398 +0000 UTC m=+145.314453655" Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.950587 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lmnvn"] Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.950812 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq"] Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.955220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" 
event={"ID":"6e653e21-3e72-4867-b39e-f374d752d503","Type":"ContainerStarted","Data":"3e97cf20fee7a8088c18e6fa6083b1c2b0aab0b86b2bc7ea7af6b798dd49835a"} Jan 30 05:10:09 crc kubenswrapper[4931]: I0130 05:10:09.999022 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.000440 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.500395666 +0000 UTC m=+145.870305933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.016431 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4phnt"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.032218 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-scnkp" event={"ID":"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e","Type":"ContainerStarted","Data":"445c7e5ffc93b3b0d06e6d127c8972a85e3ee078da31be9ad50effa715e5b3ac"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.040876 4931 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv"] Jan 30 05:10:10 crc kubenswrapper[4931]: W0130 05:10:10.064683 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdb786d0_4d5f_4e2b_9f2b_5e17ef6c77ce.slice/crio-482cd8859960b178149275312ff118e048b059056f78cadc273858d76110df6d WatchSource:0}: Error finding container 482cd8859960b178149275312ff118e048b059056f78cadc273858d76110df6d: Status 404 returned error can't find the container with id 482cd8859960b178149275312ff118e048b059056f78cadc273858d76110df6d Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.064711 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" event={"ID":"25a99ace-2c29-419e-b5de-3f11b024ee43","Type":"ContainerStarted","Data":"10f5f0d13e9b86ef0a8f012d8f44236140548b8a7f53c5a261e0fe97cdc26db0"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.072945 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.082799 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" event={"ID":"f10add70-0777-45cf-9555-7bda3b6ebeec","Type":"ContainerStarted","Data":"209a73f85ef03f2eb1edec0c2fa0de11a35ef571d557b7d655a600b5c6a9324c"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.086825 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" podStartSLOduration=123.086805967 podStartE2EDuration="2m3.086805967s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.080519191 +0000 UTC 
m=+145.450429448" watchObservedRunningTime="2026-01-30 05:10:10.086805967 +0000 UTC m=+145.456716224" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.098220 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4jb99"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.100932 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.102386 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.602371116 +0000 UTC m=+145.972281383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.103604 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" event={"ID":"c142d29b-ca43-49b7-8055-3175cdf9c45e","Type":"ContainerStarted","Data":"88590e8dc9b979b0c062e5a0650bc2c8f187d7e714f08173a54d052c271e8fff"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.109904 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tbgzs" event={"ID":"fc8b1aac-27e5-4f8c-a329-821c231fb7c6","Type":"ContainerStarted","Data":"145c6fd52710339cbfd1c98ffcf108d9c1e58dde8fadd16261aca094cd0f7307"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.110117 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-30 05:05:09 +0000 UTC, rotation deadline is 2026-11-18 11:46:15.463168017 +0000 UTC Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.110161 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7014h36m5.353010847s for next certificate rotation Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.113624 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" event={"ID":"8158446c-5883-48ad-86da-77db470d8214","Type":"ContainerStarted","Data":"1812025aee5a6e426ec2a0e855618d2c9b3bb33f38fe5c2b8477da74912e19f2"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.113671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" event={"ID":"8158446c-5883-48ad-86da-77db470d8214","Type":"ContainerStarted","Data":"822d7ccf9f3b9fabf66dcaa23268833edba0a9536c78d90a05de6cc7d5a9a209"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.151120 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h" event={"ID":"63f183f0-7d5c-45c9-88a3-df19bd214439","Type":"ContainerStarted","Data":"e002421e9d8e8d6989b65ce72814617e8cc28645ffbc4c9400ef49d370809f39"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.165544 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ff4lr" podStartSLOduration=123.165519045 podStartE2EDuration="2m3.165519045s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.131169833 +0000 UTC m=+145.501080090" watchObservedRunningTime="2026-01-30 05:10:10.165519045 +0000 UTC m=+145.535429302" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.166607 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6wmnm" podStartSLOduration=123.166603154 podStartE2EDuration="2m3.166603154s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.163560984 +0000 UTC m=+145.533471241" watchObservedRunningTime="2026-01-30 05:10:10.166603154 +0000 UTC m=+145.536513401" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.188792 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" 
event={"ID":"a34eba67-19a6-4d4e-a902-9482b2847199","Type":"ContainerStarted","Data":"bc036db765cc587e68b8a1d1d21f2ceab16d55e250cf3caa914ef5d9d19cf60b"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.205460 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.207639 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.707607421 +0000 UTC m=+146.077517678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.208655 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7jf2b" podStartSLOduration=123.208635208 podStartE2EDuration="2m3.208635208s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.207903099 +0000 UTC m=+145.577813356" watchObservedRunningTime="2026-01-30 05:10:10.208635208 +0000 UTC m=+145.578545465" Jan 30 
05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.242625 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n" event={"ID":"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a","Type":"ContainerStarted","Data":"a734f39440505ad6e36a5797c94de63391f20cf23b700fb30ed9c24d0e76f90b"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.243035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n" event={"ID":"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a","Type":"ContainerStarted","Data":"5c8894ede0b984d84cb2316d3e83205a96dd5b47e01b5cf55a18aebaac856129"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.274292 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" event={"ID":"f56094dd-41e6-41ed-9660-73cc0a3eb1ba","Type":"ContainerStarted","Data":"352b58dbbb3a109a1f739924167256b34bcddcdd723c52b0a9dc82720d06d21a"} Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.266178 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-268mt" podStartSLOduration=123.26614352 podStartE2EDuration="2m3.26614352s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.260568923 +0000 UTC m=+145.630479180" watchObservedRunningTime="2026-01-30 05:10:10.26614352 +0000 UTC m=+145.636053777" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.318765 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sh5fl" podStartSLOduration=123.318740832 podStartE2EDuration="2m3.318740832s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.309472788 +0000 UTC m=+145.679383045" watchObservedRunningTime="2026-01-30 05:10:10.318740832 +0000 UTC m=+145.688651089" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.343359 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.346377 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.846349037 +0000 UTC m=+146.216259294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.406406 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.465799 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.471800 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.971754263 +0000 UTC m=+146.341664520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.472420 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.482354 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:10.982334801 +0000 UTC m=+146.352245058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.524445 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7jf2b" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.545262 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" podStartSLOduration=123.545238874 podStartE2EDuration="2m3.545238874s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.544755072 +0000 UTC m=+145.914665319" watchObservedRunningTime="2026-01-30 05:10:10.545238874 +0000 UTC m=+145.915149131" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.553963 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:10 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Jan 30 05:10:10 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:10 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.554225 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.586020 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.586394 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.086377575 +0000 UTC m=+146.456287822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.587369 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.688802 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.689437 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.689848 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.189830454 +0000 UTC m=+146.559740711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.711682 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.721177 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2d7cc" podStartSLOduration=123.721156037 podStartE2EDuration="2m3.721156037s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.713163817 +0000 UTC m=+146.083074074" watchObservedRunningTime="2026-01-30 05:10:10.721156037 +0000 UTC m=+146.091066304" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.721489 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.721558 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.771878 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-52zxd"] Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.793442 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.794112 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.294085024 +0000 UTC m=+146.663995281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.876275 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-scnkp" podStartSLOduration=6.876255653 podStartE2EDuration="6.876255653s" podCreationTimestamp="2026-01-30 05:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:10.82475072 +0000 UTC m=+146.194660977" watchObservedRunningTime="2026-01-30 05:10:10.876255653 +0000 UTC m=+146.246165910" Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.901941 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:10 crc kubenswrapper[4931]: E0130 05:10:10.902605 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.402591615 +0000 UTC m=+146.772501872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:10 crc kubenswrapper[4931]: I0130 05:10:10.906944 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5"] Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.003272 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.003601 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.5035862 +0000 UTC m=+146.873496457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.112173 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.113081 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.613062537 +0000 UTC m=+146.982972794 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.216005 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.216413 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.716389392 +0000 UTC m=+147.086299649 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.248947 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.249023 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.317483 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.317957 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.817931421 +0000 UTC m=+147.187841678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.319916 4931 generic.go:334] "Generic (PLEG): container finished" podID="8158446c-5883-48ad-86da-77db470d8214" containerID="1812025aee5a6e426ec2a0e855618d2c9b3bb33f38fe5c2b8477da74912e19f2" exitCode=0 Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.320003 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" event={"ID":"8158446c-5883-48ad-86da-77db470d8214","Type":"ContainerDied","Data":"1812025aee5a6e426ec2a0e855618d2c9b3bb33f38fe5c2b8477da74912e19f2"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.348266 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" event={"ID":"63899a5c-b7ca-4eca-8418-c6b9bc1f774b","Type":"ContainerStarted","Data":"59eee242f13f592326eea47fc27c9fd67f269e9cd32e7b7f01d4b893c11049a9"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.351096 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" event={"ID":"107d8fb1-31b1-4bec-8d55-a27e312609b1","Type":"ContainerStarted","Data":"5d2207b458875d686ff369d91fae7714d79df423c53512b03ba6936ad5acd414"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.352818 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4phnt" 
event={"ID":"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce","Type":"ContainerStarted","Data":"482cd8859960b178149275312ff118e048b059056f78cadc273858d76110df6d"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.354069 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" event={"ID":"6e653e21-3e72-4867-b39e-f374d752d503","Type":"ContainerStarted","Data":"a69e38bc4425d484d14a5059496d9d463cff1bcddabaaac9699d49e4b73aba83"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.355860 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" event={"ID":"1a8f99a6-f163-4720-8eb4-bc8607753d79","Type":"ContainerStarted","Data":"925d7ec4214d424008eeb73fc8925f29c574b109b85902152a8bda78b7583feb"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.359935 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js" event={"ID":"8ec2553d-d0b3-4b15-a42c-73c1c25ea70f","Type":"ContainerStarted","Data":"fd397060ee56100ef6da607b98736ba790b1563d7c84b4359327aa8ee3ae9f10"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.359978 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js" event={"ID":"8ec2553d-d0b3-4b15-a42c-73c1c25ea70f","Type":"ContainerStarted","Data":"1f28c3132fe229940a650d73adad8e0bfdbf06169436b7fd464ed91fccfd74e7"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.361108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" event={"ID":"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9","Type":"ContainerStarted","Data":"39b7ca9ff298e89c9df90acb10a020ce42cc74ac13632078bd1f4f834ab339bc"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.362187 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h" event={"ID":"63f183f0-7d5c-45c9-88a3-df19bd214439","Type":"ContainerStarted","Data":"9affac49170976fe65722d58557fc4f244e6828877b8729010551b959b08d5c9"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.402828 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" event={"ID":"6db474c1-8489-43b1-bb9e-0961f9dc1dc4","Type":"ContainerStarted","Data":"f4ba6213b720b7ea748f17ee2d070c0bd967c6e08ffb2402373a84d10b51951d"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.414459 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c8568" podStartSLOduration=124.414442767 podStartE2EDuration="2m4.414442767s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.41381245 +0000 UTC m=+146.783722707" watchObservedRunningTime="2026-01-30 05:10:11.414442767 +0000 UTC m=+146.784353034" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.421911 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.422035 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.922004256 +0000 UTC m=+147.291914513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.422342 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.424807 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:11.924769578 +0000 UTC m=+147.294679825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.448328 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zfq7h" podStartSLOduration=124.448297567 podStartE2EDuration="2m4.448297567s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.444312832 +0000 UTC m=+146.814223079" watchObservedRunningTime="2026-01-30 05:10:11.448297567 +0000 UTC m=+146.818207824" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.454041 4931 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rxvzv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.454111 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" podUID="eb3c187d-243f-457f-b419-e02a6898fd48" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.458932 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.459092 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" event={"ID":"25a99ace-2c29-419e-b5de-3f11b024ee43","Type":"ContainerStarted","Data":"7750cd40c9d0fef2887b1b482623cb1d16e81114d992d1c633d524553a28371b"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.459187 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" event={"ID":"eb3c187d-243f-457f-b419-e02a6898fd48","Type":"ContainerStarted","Data":"ee2d2bbfa2ca31e3fae4c57d3281b0161094c389f759f49d53b90f5ff089bd43"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.459269 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" event={"ID":"eb3c187d-243f-457f-b419-e02a6898fd48","Type":"ContainerStarted","Data":"a368aa881a65adedde7ec8125ad717cc3ae5d122ad48225bd395fa63650713b9"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.461854 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" event={"ID":"a34eba67-19a6-4d4e-a902-9482b2847199","Type":"ContainerStarted","Data":"fd359ba9fc381b7cbc57fbce72975e41f8adaa3bec535178a56e40ff760815f8"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.480207 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42" event={"ID":"0071e157-d4b7-40ac-8a50-35f9b7aa961d","Type":"ContainerStarted","Data":"7308689e3f6f70e433937466dfc6cc4f9177148c72528bafe42f5064a1d27153"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.499932 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" event={"ID":"bc314d0c-da50-4607-93e1-5bece9c3b2b1","Type":"ContainerStarted","Data":"dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.499994 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" event={"ID":"bc314d0c-da50-4607-93e1-5bece9c3b2b1","Type":"ContainerStarted","Data":"b8dffc3066e9941e3da7e55a7eddcae34aa88188f6b968755e41658b1568e4e5"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.500599 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.508703 4931 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-phq4q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.509250 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" podUID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.518906 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" podStartSLOduration=124.518879881 podStartE2EDuration="2m4.518879881s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.507212715 +0000 
UTC m=+146.877122972" watchObservedRunningTime="2026-01-30 05:10:11.518879881 +0000 UTC m=+146.888790138" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.522815 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" event={"ID":"48aa89b7-0ab1-432b-a693-2d56358c1d83","Type":"ContainerStarted","Data":"c165f120e7002c7e49b3792b8faf5ee00cdc3fdbe72c9edf9357bcf5ce3731be"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.524821 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.525229 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.025195467 +0000 UTC m=+147.395105724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.525566 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.528337 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.02832058 +0000 UTC m=+147.398230837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.548679 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:11 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Jan 30 05:10:11 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:11 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.548752 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.557967 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg" event={"ID":"d11d03bb-be3f-43b0-a59b-5d9fde1c9717","Type":"ContainerStarted","Data":"a4faf5c3f72c32004e96c239a968666917ad28090767d94c5c11fcfcc4c18347"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.577207 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-52zxd" event={"ID":"6b53f53c-0afe-4574-aebc-64c6d81f10d9","Type":"ContainerStarted","Data":"6669c848f50e93107ea95a01f183003cc9cb854e8fd028c528fa6d1d8ba7376b"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 
05:10:11.589934 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb" event={"ID":"e521b474-9f29-4841-a365-ed1589358607","Type":"ContainerStarted","Data":"c9d349808c713379d952f12dbbedb3a0d7d0da509b01a2b7acc95be1929e3316"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.627090 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.627729 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" event={"ID":"f10add70-0777-45cf-9555-7bda3b6ebeec","Type":"ContainerStarted","Data":"97145adefbb638d94813a3bee5af6eeadd1dc166f942eff4d9310576e91c3244"} Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.628098 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.128069961 +0000 UTC m=+147.497980218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.666570 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq" event={"ID":"907e32f6-7e41-43fb-862c-c6a5f835ff73","Type":"ContainerStarted","Data":"1562adb50e57a2a6211768fe7a2f6194dcb99bf742d732ddb537d7970604e0d0"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.666651 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq" event={"ID":"907e32f6-7e41-43fb-862c-c6a5f835ff73","Type":"ContainerStarted","Data":"c2fed5e9ad93508252bfa0c1fc29fb823dc594f4543551f2f703f98b8bb963e3"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.673859 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqjcv" podStartSLOduration=124.673832194 podStartE2EDuration="2m4.673832194s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.621278403 +0000 UTC m=+146.991188660" watchObservedRunningTime="2026-01-30 05:10:11.673832194 +0000 UTC m=+147.043742451" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.680375 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" 
podStartSLOduration=124.680348685 podStartE2EDuration="2m4.680348685s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.673002382 +0000 UTC m=+147.042912639" watchObservedRunningTime="2026-01-30 05:10:11.680348685 +0000 UTC m=+147.050258942" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.697681 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" event={"ID":"c142d29b-ca43-49b7-8055-3175cdf9c45e","Type":"ContainerStarted","Data":"9373e7e356b5a3ebfb940bf62868db4278dac782cad114e8ac3779c0b8a1f774"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.705198 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2sc4j" podStartSLOduration=124.705175547 podStartE2EDuration="2m4.705175547s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.701614734 +0000 UTC m=+147.071524981" watchObservedRunningTime="2026-01-30 05:10:11.705175547 +0000 UTC m=+147.075085804" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.729113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" event={"ID":"aa1d7187-eeb1-4145-8ba0-dd1e43023003","Type":"ContainerStarted","Data":"d2b39db5352926f4d329e0ddd8917b1c0b2fdc23bf2a28b5d899d0d3de638c90"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.730592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.730946 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.230931784 +0000 UTC m=+147.600842041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.786245 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg" podStartSLOduration=124.786213497 podStartE2EDuration="2m4.786213497s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.752294966 +0000 UTC m=+147.122205223" watchObservedRunningTime="2026-01-30 05:10:11.786213497 +0000 UTC m=+147.156123754" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.793640 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-scnkp" event={"ID":"b5d96c05-25a8-4a0b-b6a0-9ca3781ac12e","Type":"ContainerStarted","Data":"50ce1bb5d9c49d5eba29fd94db26330747effdd341a08b35e1f7dc00a045f5b2"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.813984 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf" event={"ID":"f6fa9de7-6e76-472c-99c8-51a6c52eb6ae","Type":"ContainerStarted","Data":"24f84e9f6129c86faf1cceef9013ed3cdcc829143086f12ad6b2fb569b6d0d96"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.833179 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.833583 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.333568841 +0000 UTC m=+147.703479098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.839327 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dzzxq" podStartSLOduration=124.839300372 podStartE2EDuration="2m4.839300372s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.788687572 +0000 UTC m=+147.158597829" watchObservedRunningTime="2026-01-30 05:10:11.839300372 +0000 UTC m=+147.209210619" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.865584 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n" event={"ID":"0fbcd41f-dfc6-4581-9f07-74e0efea3f1a","Type":"ContainerStarted","Data":"da6724cbad15f9c08591bbc95a5a2ca42ca54ae8221777df9cf685d7f5d33b58"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.866552 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-r62wb" podStartSLOduration=124.866530808 podStartE2EDuration="2m4.866530808s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.838765388 +0000 UTC m=+147.208675645" watchObservedRunningTime="2026-01-30 05:10:11.866530808 +0000 UTC m=+147.236441065" Jan 
30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.876255 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tbgzs" event={"ID":"fc8b1aac-27e5-4f8c-a329-821c231fb7c6","Type":"ContainerStarted","Data":"331b6a7b0f00f9612e135077c6cae626b67f9cf29a228aaf37d7c4c26d10810d"} Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.877535 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tbgzs" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.888665 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-tbgzs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.888742 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tbgzs" podUID="fc8b1aac-27e5-4f8c-a329-821c231fb7c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.916885 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l98zf" podStartSLOduration=124.91686035 podStartE2EDuration="2m4.91686035s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.902352399 +0000 UTC m=+147.272262656" watchObservedRunningTime="2026-01-30 05:10:11.91686035 +0000 UTC m=+147.286770607" Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.940082 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:11 crc kubenswrapper[4931]: E0130 05:10:11.942165 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.442134085 +0000 UTC m=+147.812044342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:11 crc kubenswrapper[4931]: I0130 05:10:11.993511 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5c95n" podStartSLOduration=124.993493004 podStartE2EDuration="2m4.993493004s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:11.981842018 +0000 UTC m=+147.351752275" watchObservedRunningTime="2026-01-30 05:10:11.993493004 +0000 UTC m=+147.363403261" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.046623 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.048376 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.548358516 +0000 UTC m=+147.918268773 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.151416 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.151837 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.651823095 +0000 UTC m=+148.021733352 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.200160 4931 patch_prober.go:28] interesting pod/apiserver-76f77b778f-8ndkb container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]log ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]etcd ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/generic-apiserver-start-informers ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/max-in-flight-filter ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 30 05:10:12 crc kubenswrapper[4931]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 30 05:10:12 crc kubenswrapper[4931]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/project.openshift.io-projectcache ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/openshift.io-startinformers ok Jan 30 05:10:12 crc kubenswrapper[4931]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 30 05:10:12 crc 
kubenswrapper[4931]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 30 05:10:12 crc kubenswrapper[4931]: livez check failed Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.200210 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" podUID="dbab60d9-c5df-4396-8012-94dc987f82c2" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.257675 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.757644266 +0000 UTC m=+148.127554523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.257784 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.258542 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.258978 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.758970691 +0000 UTC m=+148.128880938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.361837 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.362257 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.862240685 +0000 UTC m=+148.232150942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.463670 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.463987 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:12.963975659 +0000 UTC m=+148.333885916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.549887 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:12 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Jan 30 05:10:12 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:12 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.550316 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.564828 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.565258 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 05:10:13.06523942 +0000 UTC m=+148.435149677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.666878 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.667259 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.167247801 +0000 UTC m=+148.537158058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.768485 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.768877 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.268861021 +0000 UTC m=+148.638771278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.870281 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.870592 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.370581064 +0000 UTC m=+148.740491321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.890664 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" event={"ID":"f61d7ea6-37e5-47de-89e8-8a0dc1b895f9","Type":"ContainerStarted","Data":"070a6faeedaa98e2ef6e21e22031a225b72c05425b2cc201d937d490c1768656"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.894252 4931 generic.go:334] "Generic (PLEG): container finished" podID="25a99ace-2c29-419e-b5de-3f11b024ee43" containerID="7750cd40c9d0fef2887b1b482623cb1d16e81114d992d1c633d524553a28371b" exitCode=0 Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.894298 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" event={"ID":"25a99ace-2c29-419e-b5de-3f11b024ee43","Type":"ContainerDied","Data":"7750cd40c9d0fef2887b1b482623cb1d16e81114d992d1c633d524553a28371b"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.894359 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" event={"ID":"25a99ace-2c29-419e-b5de-3f11b024ee43","Type":"ContainerStarted","Data":"d0ebfb4796177ce5d2f97225b61917883686ba7754e5abf03dcc7723817f548c"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.896708 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-52zxd" 
event={"ID":"6b53f53c-0afe-4574-aebc-64c6d81f10d9","Type":"ContainerStarted","Data":"26ed8b69c544c6a6a40d832c21f3653e3c1ee58e9c255975345fd23ae00e97fb"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.898794 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42" event={"ID":"0071e157-d4b7-40ac-8a50-35f9b7aa961d","Type":"ContainerStarted","Data":"3e80aa8483d5c39e31d431dfff2b17161d28f87a2bff18f79bf6df5b318dbd66"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.901415 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" event={"ID":"6db474c1-8489-43b1-bb9e-0961f9dc1dc4","Type":"ContainerStarted","Data":"966a8b56a0cd29c0d03de4d675353b24308f545b212e5aee4d14328c6df6b7bb"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.901471 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" event={"ID":"6db474c1-8489-43b1-bb9e-0961f9dc1dc4","Type":"ContainerStarted","Data":"8c8724012f411472cde1d05efea748fd76c75afc1a34222827cc887796d90a7b"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.905486 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4phnt" event={"ID":"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce","Type":"ContainerStarted","Data":"c887ef4525a06db76b0777a24ea390f5605ebdb97a053a363a216b631473280a"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.905515 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4phnt" event={"ID":"fdb786d0-4d5f-4e2b-9f2b-5e17ef6c77ce","Type":"ContainerStarted","Data":"905f34b53355a70cfee7d2120983e52950c9a688eec5e904ffe4eb9a43173c56"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.905857 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4phnt" Jan 
30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.907597 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" event={"ID":"63899a5c-b7ca-4eca-8418-c6b9bc1f774b","Type":"ContainerStarted","Data":"2f0546229f980ec7b9a46854704bd46525b35dcd6613e96f66c311941bdb1380"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.908287 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.911046 4931 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gx6j5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.911093 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" podUID="63899a5c-b7ca-4eca-8418-c6b9bc1f774b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.911378 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" event={"ID":"48aa89b7-0ab1-432b-a693-2d56358c1d83","Type":"ContainerStarted","Data":"f41be65f196765acfd7674543e25034e5f5217f9a43df82a2d1cc2cc2d6a6170"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.911956 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.913559 4931 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-n4fvr 
container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.913588 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" podUID="48aa89b7-0ab1-432b-a693-2d56358c1d83" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.915182 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" event={"ID":"aa1d7187-eeb1-4145-8ba0-dd1e43023003","Type":"ContainerStarted","Data":"4ec4b05f8b1f9a05c8644b6183b1ae9bf0757c2f973729ade9c9ffa179f141ea"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.915211 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" event={"ID":"aa1d7187-eeb1-4145-8ba0-dd1e43023003","Type":"ContainerStarted","Data":"f2d2d2c1d5060878fd889c6c9440b533b28c30e2d9cd909a1261840854e2c3ca"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.915532 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.918139 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb" event={"ID":"e521b474-9f29-4841-a365-ed1589358607","Type":"ContainerStarted","Data":"dfcd024b2f504a863bbcbc60457cb93ebf2ec2f189b4fb99017279711c4d374b"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.919976 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js" event={"ID":"8ec2553d-d0b3-4b15-a42c-73c1c25ea70f","Type":"ContainerStarted","Data":"ace69891c41d64e610c291a77fcc35e2d208c29663c6aee22b44205b825605f7"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.921909 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6pfwg" event={"ID":"d11d03bb-be3f-43b0-a59b-5d9fde1c9717","Type":"ContainerStarted","Data":"95f6ce1bc4e15c222b9fce15112cc0e0076f6e1ee9a889d3822f17d045d274f0"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.923226 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" event={"ID":"1a8f99a6-f163-4720-8eb4-bc8607753d79","Type":"ContainerStarted","Data":"76f686a64b7bcba52e9cf572d78b41631a5873f435dbdf098126fe32ac5ccc3f"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.928565 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" event={"ID":"8158446c-5883-48ad-86da-77db470d8214","Type":"ContainerStarted","Data":"b8ee44b302d783cba73cd5cb20413829996b68d6d940ca0f42328230d40f3f4e"} Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.928592 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.931353 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-tbgzs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.931401 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tbgzs" 
podUID="fc8b1aac-27e5-4f8c-a329-821c231fb7c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.931463 4931 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-phq4q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.931475 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" podUID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.952931 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tbgzs" podStartSLOduration=125.952915618 podStartE2EDuration="2m5.952915618s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:12.05232677 +0000 UTC m=+147.422237027" watchObservedRunningTime="2026-01-30 05:10:12.952915618 +0000 UTC m=+148.322825875" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.970274 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rxvzv" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.971189 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.971383 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.471357103 +0000 UTC m=+148.841267360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.971471 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:12 crc kubenswrapper[4931]: E0130 05:10:12.971926 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.471903657 +0000 UTC m=+148.841813914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.994157 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4jb99" podStartSLOduration=125.994136891 podStartE2EDuration="2m5.994136891s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:12.956725488 +0000 UTC m=+148.326635745" watchObservedRunningTime="2026-01-30 05:10:12.994136891 +0000 UTC m=+148.364047138" Jan 30 05:10:12 crc kubenswrapper[4931]: I0130 05:10:12.996536 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" podStartSLOduration=125.996530264 podStartE2EDuration="2m5.996530264s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:12.995195519 +0000 UTC m=+148.365105776" watchObservedRunningTime="2026-01-30 05:10:12.996530264 +0000 UTC m=+148.366440521" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.021673 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.021752 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.023927 4931 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-pspt5 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.024018 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" podUID="25a99ace-2c29-419e-b5de-3f11b024ee43" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.027693 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" podStartSLOduration=126.027679073 podStartE2EDuration="2m6.027679073s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.026525622 +0000 UTC m=+148.396435889" watchObservedRunningTime="2026-01-30 05:10:13.027679073 +0000 UTC m=+148.397589330" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.048995 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84v42" podStartSLOduration=126.048977722 podStartE2EDuration="2m6.048977722s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.047694019 +0000 UTC m=+148.417604276" watchObservedRunningTime="2026-01-30 05:10:13.048977722 +0000 
UTC m=+148.418887979" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.073174 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.073376 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.573348623 +0000 UTC m=+148.943258880 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.076078 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.079279 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 05:10:13.579261068 +0000 UTC m=+148.949171325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.112818 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gb6js" podStartSLOduration=126.11279368 podStartE2EDuration="2m6.11279368s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.081204009 +0000 UTC m=+148.451114266" watchObservedRunningTime="2026-01-30 05:10:13.11279368 +0000 UTC m=+148.482703937" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.114908 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" podStartSLOduration=126.114901115 podStartE2EDuration="2m6.114901115s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.107938332 +0000 UTC m=+148.477848579" watchObservedRunningTime="2026-01-30 05:10:13.114901115 +0000 UTC m=+148.484811372" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.145056 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4phnt" podStartSLOduration=8.145037327 
podStartE2EDuration="8.145037327s" podCreationTimestamp="2026-01-30 05:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.141849783 +0000 UTC m=+148.511760040" watchObservedRunningTime="2026-01-30 05:10:13.145037327 +0000 UTC m=+148.514947584" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.179374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.179610 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.179663 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.179689 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.179720 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.179759 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.679734119 +0000 UTC m=+149.049644366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.179806 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.180201 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.680191281 +0000 UTC m=+149.050101538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.185338 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lmnvn" podStartSLOduration=126.185311865 podStartE2EDuration="2m6.185311865s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.165800543 +0000 UTC m=+148.535710810" watchObservedRunningTime="2026-01-30 05:10:13.185311865 +0000 UTC m=+148.555222112" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.185553 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" podStartSLOduration=126.185549022 podStartE2EDuration="2m6.185549022s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.179556474 +0000 UTC m=+148.549466731" watchObservedRunningTime="2026-01-30 05:10:13.185549022 +0000 UTC m=+148.555459279" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.185882 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.192477 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.198281 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.199235 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.200963 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.223560 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-52zxd" podStartSLOduration=8.22354272 podStartE2EDuration="8.22354272s" podCreationTimestamp="2026-01-30 05:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.203999316 +0000 UTC m=+148.573909593" watchObservedRunningTime="2026-01-30 05:10:13.22354272 +0000 UTC m=+148.593452967" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.252853 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" podStartSLOduration=126.25283334 podStartE2EDuration="2m6.25283334s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.249001469 +0000 UTC m=+148.618911726" watchObservedRunningTime="2026-01-30 05:10:13.25283334 +0000 UTC m=+148.622743597" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.287599 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.288198 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.788175069 +0000 UTC m=+149.158085326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.292626 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" podStartSLOduration=126.292598905 podStartE2EDuration="2m6.292598905s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:13.28935416 +0000 UTC m=+148.659264417" watchObservedRunningTime="2026-01-30 05:10:13.292598905 +0000 UTC m=+148.662509162" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.348096 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w2zzb" podStartSLOduration=126.348068443 podStartE2EDuration="2m6.348068443s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 
05:10:13.317877589 +0000 UTC m=+148.687787846" watchObservedRunningTime="2026-01-30 05:10:13.348068443 +0000 UTC m=+148.717978700" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.390235 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.390662 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.890645802 +0000 UTC m=+149.260556059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.453127 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.482731 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.490857 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.491105 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.99107473 +0000 UTC m=+149.360984987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.491382 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.491822 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:13.991815609 +0000 UTC m=+149.361725866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.566758 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:13 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Jan 30 05:10:13 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:13 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.566841 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.595859 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.596190 4931 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.096168612 +0000 UTC m=+149.466078869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.702527 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.702957 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.202944878 +0000 UTC m=+149.572855135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: W0130 05:10:13.802261 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-f5a040cc9674363783d069aa1b9802e19208315ea912837e33dd68f24600a4f4 WatchSource:0}: Error finding container f5a040cc9674363783d069aa1b9802e19208315ea912837e33dd68f24600a4f4: Status 404 returned error can't find the container with id f5a040cc9674363783d069aa1b9802e19208315ea912837e33dd68f24600a4f4 Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.803978 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.804110 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.304082656 +0000 UTC m=+149.673992913 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.804581 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.805204 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.305179355 +0000 UTC m=+149.675089612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.838087 4931 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.905703 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:13 crc kubenswrapper[4931]: E0130 05:10:13.905972 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.405955803 +0000 UTC m=+149.775866060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.945529 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f5a040cc9674363783d069aa1b9802e19208315ea912837e33dd68f24600a4f4"} Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.958935 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" event={"ID":"107d8fb1-31b1-4bec-8d55-a27e312609b1","Type":"ContainerStarted","Data":"dcc07b917752e4701b33de49bd4c70afad023ba346e2b08fbb6a3ea298f25922"} Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.958983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" event={"ID":"107d8fb1-31b1-4bec-8d55-a27e312609b1","Type":"ContainerStarted","Data":"939ca2e71ca9511bbb2e4bd38be07de31a2e1d98b0e0172601c8b9718a391a15"} Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.963454 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-tbgzs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.963538 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tbgzs" podUID="fc8b1aac-27e5-4f8c-a329-821c231fb7c6" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 05:10:13 crc kubenswrapper[4931]: I0130 05:10:13.977501 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n4fvr" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.010088 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:14 crc kubenswrapper[4931]: E0130 05:10:14.013227 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.513191171 +0000 UTC m=+149.883101418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.117902 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:14 crc kubenswrapper[4931]: E0130 05:10:14.118518 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.618495089 +0000 UTC m=+149.988405346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.219869 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:14 crc kubenswrapper[4931]: E0130 05:10:14.220725 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.720704835 +0000 UTC m=+150.090615092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.320790 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:14 crc kubenswrapper[4931]: E0130 05:10:14.321250 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.821221656 +0000 UTC m=+150.191131913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.371298 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k5fcn"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.372731 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.405479 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.424739 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-utilities\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.424790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg6xd\" (UniqueName: \"kubernetes.io/projected/9163b44e-4aa5-422c-a2fd-55747c8d506e-kube-api-access-zg6xd\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.424862 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.424887 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-catalog-content\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc 
kubenswrapper[4931]: E0130 05:10:14.425340 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:14.925325562 +0000 UTC m=+150.295235819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.493908 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5fcn"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.525944 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.526475 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-catalog-content\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.526567 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-utilities\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.526594 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg6xd\" (UniqueName: \"kubernetes.io/projected/9163b44e-4aa5-422c-a2fd-55747c8d506e-kube-api-access-zg6xd\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: E0130 05:10:14.527236 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:15.02721168 +0000 UTC m=+150.397121937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.527792 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-catalog-content\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.528083 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-utilities\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.550237 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:14 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Jan 30 05:10:14 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:14 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.550313 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.566458 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frnwj"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.569062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frnwj"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.569171 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.574129 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.595587 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg6xd\" (UniqueName: \"kubernetes.io/projected/9163b44e-4aa5-422c-a2fd-55747c8d506e-kube-api-access-zg6xd\") pod \"community-operators-k5fcn\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.636557 4931 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-30T05:10:13.83811093Z","Handler":null,"Name":""} Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.637539 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-catalog-content\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.637633 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.637686 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6cpz\" (UniqueName: \"kubernetes.io/projected/0dbdc3df-7306-41e4-93c6-d7d27d481789-kube-api-access-z6cpz\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.637722 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-utilities\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: E0130 05:10:14.638205 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:10:15.138186826 +0000 UTC m=+150.508097083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f8zg7" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.730484 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gx6j5" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.743079 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.743267 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6cpz\" (UniqueName: \"kubernetes.io/projected/0dbdc3df-7306-41e4-93c6-d7d27d481789-kube-api-access-z6cpz\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.743295 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-utilities\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.743355 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-catalog-content\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: E0130 05:10:14.743824 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:10:15.243799212 +0000 UTC m=+150.613709469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.743892 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-catalog-content\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.744212 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-utilities\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.750657 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-4j7wh"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.751712 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.780243 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6cpz\" (UniqueName: \"kubernetes.io/projected/0dbdc3df-7306-41e4-93c6-d7d27d481789-kube-api-access-z6cpz\") pod \"certified-operators-frnwj\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.780747 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4j7wh"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.785070 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.803122 4931 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.803169 4931 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.851363 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5brk\" (UniqueName: \"kubernetes.io/projected/39e99a4f-8956-424c-a4c6-7a67f9983cd0-kube-api-access-v5brk\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.851482 
4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-utilities\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.851525 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-catalog-content\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.851553 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.873385 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.873446 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.938693 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f8zg7\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.943945 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.953052 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pfl6d"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.953763 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.953938 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5brk\" (UniqueName: \"kubernetes.io/projected/39e99a4f-8956-424c-a4c6-7a67f9983cd0-kube-api-access-v5brk\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.953993 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-utilities\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.954029 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-catalog-content\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.954276 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.954448 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-catalog-content\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.955003 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-utilities\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.979775 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfl6d"] Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.982875 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5brk\" (UniqueName: \"kubernetes.io/projected/39e99a4f-8956-424c-a4c6-7a67f9983cd0-kube-api-access-v5brk\") pod \"community-operators-4j7wh\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:14 crc kubenswrapper[4931]: I0130 05:10:14.989876 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.009961 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.012537 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f08533c8c1e5e422c76f7d093f8442201c4a8c0eb6f27404db8c3f13e460af3d"} Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.012575 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3693cd0900980aaf02b98121d11ee70a79160d1598ac0fde1e855eb064cb7c4a"} Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.014237 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ac537307a51a0abf53217144aeb8e8505649d78e862fdfde9f5b7b863b8585ee"} Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.016734 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" event={"ID":"107d8fb1-31b1-4bec-8d55-a27e312609b1","Type":"ContainerStarted","Data":"18057f42a99b06d4067370e7058db919da5a13cb4cd6ff4eda3c3b5719b57209"} Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 
05:10:15.018925 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2e170e479a37fb7f0f7ea50509dc59b9ba98d604e2d4a30f544e2559d870dec5"} Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.018953 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"75efaad328d23840ed695e63aadcacb49e97d45449f35d6f23aac0c16b4637c7"} Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.019250 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.061033 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn84w\" (UniqueName: \"kubernetes.io/projected/72ab8593-3b5e-421a-ac80-b85376b21ffe-kube-api-access-xn84w\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.061182 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-utilities\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.061367 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-catalog-content\") pod \"certified-operators-pfl6d\" (UID: 
\"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.070006 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.098042 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bj2bf" podStartSLOduration=10.09800645 podStartE2EDuration="10.09800645s" podCreationTimestamp="2026-01-30 05:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:15.067633292 +0000 UTC m=+150.437543549" watchObservedRunningTime="2026-01-30 05:10:15.09800645 +0000 UTC m=+150.467916707" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.162439 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-catalog-content\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.162516 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn84w\" (UniqueName: \"kubernetes.io/projected/72ab8593-3b5e-421a-ac80-b85376b21ffe-kube-api-access-xn84w\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.162574 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-utilities\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") 
" pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.163603 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-utilities\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.164828 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-catalog-content\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.200710 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn84w\" (UniqueName: \"kubernetes.io/projected/72ab8593-3b5e-421a-ac80-b85376b21ffe-kube-api-access-xn84w\") pod \"certified-operators-pfl6d\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") " pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.291006 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.351645 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5fcn"] Jan 30 05:10:15 crc kubenswrapper[4931]: W0130 05:10:15.390590 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9163b44e_4aa5_422c_a2fd_55747c8d506e.slice/crio-072fd632153afb9e250c4d51854168ac5eaa4674e8d0bff4bbfe11fe55d97dbc WatchSource:0}: Error finding container 072fd632153afb9e250c4d51854168ac5eaa4674e8d0bff4bbfe11fe55d97dbc: Status 404 returned error can't find the container with id 072fd632153afb9e250c4d51854168ac5eaa4674e8d0bff4bbfe11fe55d97dbc Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.402541 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frnwj"] Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.453241 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.522583 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f8zg7"] Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.547176 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:15 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Jan 30 05:10:15 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:15 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.547251 4931 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.591568 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4j7wh"] Jan 30 05:10:15 crc kubenswrapper[4931]: W0130 05:10:15.616778 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39e99a4f_8956_424c_a4c6_7a67f9983cd0.slice/crio-450b4e5ad73b8f494223ffa7ed558e29541dc036fd17fb80d85509be86652339 WatchSource:0}: Error finding container 450b4e5ad73b8f494223ffa7ed558e29541dc036fd17fb80d85509be86652339: Status 404 returned error can't find the container with id 450b4e5ad73b8f494223ffa7ed558e29541dc036fd17fb80d85509be86652339 Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.710990 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfl6d"] Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.946476 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.947216 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.949819 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.949817 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.956869 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.975016 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/792f11bc-0559-4037-8c28-1628f1cc0ec7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:15 crc kubenswrapper[4931]: I0130 05:10:15.975120 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/792f11bc-0559-4037-8c28-1628f1cc0ec7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.025138 4931 generic.go:334] "Generic (PLEG): container finished" podID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerID="47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b" exitCode=0 Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.025207 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnwj" 
event={"ID":"0dbdc3df-7306-41e4-93c6-d7d27d481789","Type":"ContainerDied","Data":"47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.025256 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnwj" event={"ID":"0dbdc3df-7306-41e4-93c6-d7d27d481789","Type":"ContainerStarted","Data":"8fc2bd9106d95cb2212067bb79c5743a637b67855826a61a2a9690fea3308441"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.027651 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.028370 4931 generic.go:334] "Generic (PLEG): container finished" podID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerID="0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893" exitCode=0 Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.028438 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j7wh" event={"ID":"39e99a4f-8956-424c-a4c6-7a67f9983cd0","Type":"ContainerDied","Data":"0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.028456 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j7wh" event={"ID":"39e99a4f-8956-424c-a4c6-7a67f9983cd0","Type":"ContainerStarted","Data":"450b4e5ad73b8f494223ffa7ed558e29541dc036fd17fb80d85509be86652339"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.030357 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfl6d" event={"ID":"72ab8593-3b5e-421a-ac80-b85376b21ffe","Type":"ContainerStarted","Data":"d2226ce036426e1065e325748a9372dea2501d4becb5598917d5e4c3d429e02b"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.033102 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerID="395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae" exitCode=0 Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.033155 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5fcn" event={"ID":"9163b44e-4aa5-422c-a2fd-55747c8d506e","Type":"ContainerDied","Data":"395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.033224 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5fcn" event={"ID":"9163b44e-4aa5-422c-a2fd-55747c8d506e","Type":"ContainerStarted","Data":"072fd632153afb9e250c4d51854168ac5eaa4674e8d0bff4bbfe11fe55d97dbc"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.035004 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" event={"ID":"32e4a367-9945-4fdb-b5bc-4c8d35512264","Type":"ContainerStarted","Data":"a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.035045 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" event={"ID":"32e4a367-9945-4fdb-b5bc-4c8d35512264","Type":"ContainerStarted","Data":"79ebc9473f22f72df11aa297cb419ebdd7c57ca36caf670a91a0d056621b7c54"} Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.035246 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.076478 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/792f11bc-0559-4037-8c28-1628f1cc0ec7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.076571 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/792f11bc-0559-4037-8c28-1628f1cc0ec7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.076652 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/792f11bc-0559-4037-8c28-1628f1cc0ec7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.102442 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/792f11bc-0559-4037-8c28-1628f1cc0ec7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.247380 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.252687 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-8ndkb" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.263285 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.269448 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" podStartSLOduration=129.269402594 podStartE2EDuration="2m9.269402594s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:16.183946038 +0000 UTC m=+151.553856295" watchObservedRunningTime="2026-01-30 05:10:16.269402594 +0000 UTC m=+151.639312871" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.449230 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.449303 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.474551 4931 patch_prober.go:28] interesting pod/console-f9d7485db-ff4lr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.474644 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ff4lr" podUID="cf0e8eba-09e8-4d9c-87de-9c57583e7276" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.553744 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:16 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Jan 30 05:10:16 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:16 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.553809 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.569381 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7jp5s"] Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.570492 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.577698 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.583390 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xcp\" (UniqueName: \"kubernetes.io/projected/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-kube-api-access-d8xcp\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.583468 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-utilities\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 
05:10:16.583496 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-catalog-content\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.590676 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jp5s"] Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.628087 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.685374 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-catalog-content\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.685865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xcp\" (UniqueName: \"kubernetes.io/projected/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-kube-api-access-d8xcp\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.685917 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-utilities\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.686351 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-catalog-content\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.688023 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-utilities\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.712497 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8xcp\" (UniqueName: \"kubernetes.io/projected/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-kube-api-access-d8xcp\") pod \"redhat-marketplace-7jp5s\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.773036 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwdht" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.927183 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.956954 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mltbk"] Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.958695 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.960767 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mltbk"] Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.999810 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjhnn\" (UniqueName: \"kubernetes.io/projected/025f8209-dd2a-482c-8bb2-e0ad2a98a563-kube-api-access-xjhnn\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:16 crc kubenswrapper[4931]: I0130 05:10:16.999863 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-utilities\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:16.999912 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-catalog-content\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.045085 4931 generic.go:334] "Generic (PLEG): container finished" podID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerID="bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5" exitCode=0 Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.045898 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfl6d" 
event={"ID":"72ab8593-3b5e-421a-ac80-b85376b21ffe","Type":"ContainerDied","Data":"bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5"} Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.054003 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"792f11bc-0559-4037-8c28-1628f1cc0ec7","Type":"ContainerStarted","Data":"6d7890ef195ce83aca04b6656932ece915976829dbe4227034fe4866b13227f6"} Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.101319 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-catalog-content\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.103074 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjhnn\" (UniqueName: \"kubernetes.io/projected/025f8209-dd2a-482c-8bb2-e0ad2a98a563-kube-api-access-xjhnn\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.103106 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-utilities\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.104263 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-utilities\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " 
pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.106460 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-catalog-content\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.125770 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjhnn\" (UniqueName: \"kubernetes.io/projected/025f8209-dd2a-482c-8bb2-e0ad2a98a563-kube-api-access-xjhnn\") pod \"redhat-marketplace-mltbk\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") " pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.246774 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jp5s"] Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.283439 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.540609 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.544658 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z64mf"] Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.546388 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.548464 4931 patch_prober.go:28] interesting pod/router-default-5444994796-268mt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:10:17 crc kubenswrapper[4931]: [+]has-synced ok Jan 30 05:10:17 crc kubenswrapper[4931]: [+]process-running ok Jan 30 05:10:17 crc kubenswrapper[4931]: healthz check failed Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.553057 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-268mt" podUID="21e8738e-651f-4f09-a052-1ff22028e3f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.554843 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.562278 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z64mf"] Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.717354 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-catalog-content\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.717406 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gdhb\" (UniqueName: \"kubernetes.io/projected/bb356dde-8435-471d-a260-8966eeb15eb3-kube-api-access-9gdhb\") pod \"redhat-operators-z64mf\" (UID: 
\"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.717553 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-utilities\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.818862 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-utilities\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.819221 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-catalog-content\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.819438 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gdhb\" (UniqueName: \"kubernetes.io/projected/bb356dde-8435-471d-a260-8966eeb15eb3-kube-api-access-9gdhb\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.819953 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-catalog-content\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " 
pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.820209 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-utilities\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.848505 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gdhb\" (UniqueName: \"kubernetes.io/projected/bb356dde-8435-471d-a260-8966eeb15eb3-kube-api-access-9gdhb\") pod \"redhat-operators-z64mf\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.850773 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mltbk"] Jan 30 05:10:17 crc kubenswrapper[4931]: W0130 05:10:17.872286 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod025f8209_dd2a_482c_8bb2_e0ad2a98a563.slice/crio-38ccccf46a0ae1cf8c452695abc11e45f7b2dbc3854f74a91d1dd3b017d08519 WatchSource:0}: Error finding container 38ccccf46a0ae1cf8c452695abc11e45f7b2dbc3854f74a91d1dd3b017d08519: Status 404 returned error can't find the container with id 38ccccf46a0ae1cf8c452695abc11e45f7b2dbc3854f74a91d1dd3b017d08519 Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.897197 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.945667 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56dq5"] Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.948810 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:17 crc kubenswrapper[4931]: I0130 05:10:17.966052 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56dq5"] Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.018235 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-tbgzs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.018304 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-tbgzs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.018308 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tbgzs" podUID="fc8b1aac-27e5-4f8c-a329-821c231fb7c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.018454 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tbgzs" podUID="fc8b1aac-27e5-4f8c-a329-821c231fb7c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.027736 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.034682 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pspt5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.094301 4931 generic.go:334] "Generic (PLEG): container finished" podID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerID="ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2" exitCode=0 Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.094664 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jp5s" event={"ID":"9ac0e0dc-4375-4faf-a262-2cf4e9772a29","Type":"ContainerDied","Data":"ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2"} Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.094694 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jp5s" event={"ID":"9ac0e0dc-4375-4faf-a262-2cf4e9772a29","Type":"ContainerStarted","Data":"827a507dec87e3e9291f3f56b6d8162668e69da1d6e51e16d8c5431ea4ab1518"} Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.107018 4931 generic.go:334] "Generic (PLEG): container finished" podID="1a8f99a6-f163-4720-8eb4-bc8607753d79" containerID="76f686a64b7bcba52e9cf572d78b41631a5873f435dbdf098126fe32ac5ccc3f" exitCode=0 Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.107137 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" event={"ID":"1a8f99a6-f163-4720-8eb4-bc8607753d79","Type":"ContainerDied","Data":"76f686a64b7bcba52e9cf572d78b41631a5873f435dbdf098126fe32ac5ccc3f"} Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.111307 4931 generic.go:334] "Generic (PLEG): container finished" podID="792f11bc-0559-4037-8c28-1628f1cc0ec7" containerID="7e807662008e3ca4617fe0888edfc68608e136fe58951ccd21cc89ffd24e6aaa" exitCode=0 Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.111364 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"792f11bc-0559-4037-8c28-1628f1cc0ec7","Type":"ContainerDied","Data":"7e807662008e3ca4617fe0888edfc68608e136fe58951ccd21cc89ffd24e6aaa"} Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.124895 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mltbk" event={"ID":"025f8209-dd2a-482c-8bb2-e0ad2a98a563","Type":"ContainerStarted","Data":"38ccccf46a0ae1cf8c452695abc11e45f7b2dbc3854f74a91d1dd3b017d08519"} Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.127087 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-catalog-content\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.127230 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-utilities\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.127283 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnh5\" (UniqueName: \"kubernetes.io/projected/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-kube-api-access-2dnh5\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.175791 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.228162 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-catalog-content\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.228252 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-utilities\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.228288 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnh5\" (UniqueName: \"kubernetes.io/projected/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-kube-api-access-2dnh5\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.232571 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-catalog-content\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.232621 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-utilities\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.255738 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnh5\" 
(UniqueName: \"kubernetes.io/projected/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-kube-api-access-2dnh5\") pod \"redhat-operators-56dq5\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") " pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.282969 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56dq5" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.543507 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.546989 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-268mt" Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.559452 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z64mf"] Jan 30 05:10:18 crc kubenswrapper[4931]: I0130 05:10:18.773647 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56dq5"] Jan 30 05:10:18 crc kubenswrapper[4931]: W0130 05:10:18.788712 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9dd69f_1c2e_4b14_83f8_dff33fe2118d.slice/crio-54c769092679e1f74ddf76a97ce7a50ba4a68138b396c7ae26403fddbe513fc1 WatchSource:0}: Error finding container 54c769092679e1f74ddf76a97ce7a50ba4a68138b396c7ae26403fddbe513fc1: Status 404 returned error can't find the container with id 54c769092679e1f74ddf76a97ce7a50ba4a68138b396c7ae26403fddbe513fc1 Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.159782 4931 generic.go:334] "Generic (PLEG): container finished" podID="bb356dde-8435-471d-a260-8966eeb15eb3" containerID="27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f" exitCode=0 Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 
05:10:19.160500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z64mf" event={"ID":"bb356dde-8435-471d-a260-8966eeb15eb3","Type":"ContainerDied","Data":"27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f"} Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.160541 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z64mf" event={"ID":"bb356dde-8435-471d-a260-8966eeb15eb3","Type":"ContainerStarted","Data":"a6a9276eab6557cd642ac08c2583f1c3b08c9bbb62478c22c66b2f818922633b"} Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.170204 4931 generic.go:334] "Generic (PLEG): container finished" podID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerID="95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836" exitCode=0 Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.170310 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mltbk" event={"ID":"025f8209-dd2a-482c-8bb2-e0ad2a98a563","Type":"ContainerDied","Data":"95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836"} Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.200864 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dq5" event={"ID":"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d","Type":"ContainerStarted","Data":"54c769092679e1f74ddf76a97ce7a50ba4a68138b396c7ae26403fddbe513fc1"} Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.570653 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.600412 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.659584 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/792f11bc-0559-4037-8c28-1628f1cc0ec7-kubelet-dir\") pod \"792f11bc-0559-4037-8c28-1628f1cc0ec7\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.659723 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/792f11bc-0559-4037-8c28-1628f1cc0ec7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "792f11bc-0559-4037-8c28-1628f1cc0ec7" (UID: "792f11bc-0559-4037-8c28-1628f1cc0ec7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.659902 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/792f11bc-0559-4037-8c28-1628f1cc0ec7-kube-api-access\") pod \"792f11bc-0559-4037-8c28-1628f1cc0ec7\" (UID: \"792f11bc-0559-4037-8c28-1628f1cc0ec7\") " Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.660331 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/792f11bc-0559-4037-8c28-1628f1cc0ec7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.666509 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792f11bc-0559-4037-8c28-1628f1cc0ec7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "792f11bc-0559-4037-8c28-1628f1cc0ec7" (UID: "792f11bc-0559-4037-8c28-1628f1cc0ec7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.761079 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-855th\" (UniqueName: \"kubernetes.io/projected/1a8f99a6-f163-4720-8eb4-bc8607753d79-kube-api-access-855th\") pod \"1a8f99a6-f163-4720-8eb4-bc8607753d79\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.761157 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a8f99a6-f163-4720-8eb4-bc8607753d79-secret-volume\") pod \"1a8f99a6-f163-4720-8eb4-bc8607753d79\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.761213 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a8f99a6-f163-4720-8eb4-bc8607753d79-config-volume\") pod \"1a8f99a6-f163-4720-8eb4-bc8607753d79\" (UID: \"1a8f99a6-f163-4720-8eb4-bc8607753d79\") " Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.761748 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/792f11bc-0559-4037-8c28-1628f1cc0ec7-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.762601 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8f99a6-f163-4720-8eb4-bc8607753d79-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a8f99a6-f163-4720-8eb4-bc8607753d79" (UID: "1a8f99a6-f163-4720-8eb4-bc8607753d79"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.765102 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8f99a6-f163-4720-8eb4-bc8607753d79-kube-api-access-855th" (OuterVolumeSpecName: "kube-api-access-855th") pod "1a8f99a6-f163-4720-8eb4-bc8607753d79" (UID: "1a8f99a6-f163-4720-8eb4-bc8607753d79"). InnerVolumeSpecName "kube-api-access-855th". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.765346 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8f99a6-f163-4720-8eb4-bc8607753d79-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a8f99a6-f163-4720-8eb4-bc8607753d79" (UID: "1a8f99a6-f163-4720-8eb4-bc8607753d79"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.863009 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a8f99a6-f163-4720-8eb4-bc8607753d79-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.863045 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a8f99a6-f163-4720-8eb4-bc8607753d79-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:19 crc kubenswrapper[4931]: I0130 05:10:19.863066 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-855th\" (UniqueName: \"kubernetes.io/projected/1a8f99a6-f163-4720-8eb4-bc8607753d79-kube-api-access-855th\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.214759 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"792f11bc-0559-4037-8c28-1628f1cc0ec7","Type":"ContainerDied","Data":"6d7890ef195ce83aca04b6656932ece915976829dbe4227034fe4866b13227f6"} Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.214826 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d7890ef195ce83aca04b6656932ece915976829dbe4227034fe4866b13227f6" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.214906 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.237550 4931 generic.go:334] "Generic (PLEG): container finished" podID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerID="c23bf00de71e269e3c5c3d32d2b7e2842aa494dd5f905c0feee3ad4799d5aa22" exitCode=0 Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.237750 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dq5" event={"ID":"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d","Type":"ContainerDied","Data":"c23bf00de71e269e3c5c3d32d2b7e2842aa494dd5f905c0feee3ad4799d5aa22"} Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.244770 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" event={"ID":"1a8f99a6-f163-4720-8eb4-bc8607753d79","Type":"ContainerDied","Data":"925d7ec4214d424008eeb73fc8925f29c574b109b85902152a8bda78b7583feb"} Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.244820 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="925d7ec4214d424008eeb73fc8925f29c574b109b85902152a8bda78b7583feb" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.244825 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.568791 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 05:10:20 crc kubenswrapper[4931]: E0130 05:10:20.569060 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8f99a6-f163-4720-8eb4-bc8607753d79" containerName="collect-profiles" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.569072 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8f99a6-f163-4720-8eb4-bc8607753d79" containerName="collect-profiles" Jan 30 05:10:20 crc kubenswrapper[4931]: E0130 05:10:20.569085 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792f11bc-0559-4037-8c28-1628f1cc0ec7" containerName="pruner" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.569093 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="792f11bc-0559-4037-8c28-1628f1cc0ec7" containerName="pruner" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.569207 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="792f11bc-0559-4037-8c28-1628f1cc0ec7" containerName="pruner" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.569217 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8f99a6-f163-4720-8eb4-bc8607753d79" containerName="collect-profiles" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.569610 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.573481 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.574650 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.576897 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.673498 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd46484-5f82-4786-a8a1-25484b70f820-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.673637 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd46484-5f82-4786-a8a1-25484b70f820-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.775607 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd46484-5f82-4786-a8a1-25484b70f820-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.775727 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/dcd46484-5f82-4786-a8a1-25484b70f820-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.775769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd46484-5f82-4786-a8a1-25484b70f820-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.795641 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd46484-5f82-4786-a8a1-25484b70f820-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:20 crc kubenswrapper[4931]: I0130 05:10:20.889824 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:21 crc kubenswrapper[4931]: I0130 05:10:21.511158 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 05:10:22 crc kubenswrapper[4931]: I0130 05:10:22.296073 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dcd46484-5f82-4786-a8a1-25484b70f820","Type":"ContainerStarted","Data":"7f8a5441fbe7c584c757919eb28cbefcf2c8e76103b55e4ac23031a1715dee4e"} Jan 30 05:10:23 crc kubenswrapper[4931]: I0130 05:10:23.281916 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4phnt" Jan 30 05:10:23 crc kubenswrapper[4931]: I0130 05:10:23.312309 4931 generic.go:334] "Generic (PLEG): container finished" podID="dcd46484-5f82-4786-a8a1-25484b70f820" containerID="1f13b2e50cc2004031bacb2a31e30974e49e8bb1676e761b872966edb1b3f54f" exitCode=0 Jan 30 05:10:23 crc kubenswrapper[4931]: I0130 05:10:23.312347 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dcd46484-5f82-4786-a8a1-25484b70f820","Type":"ContainerDied","Data":"1f13b2e50cc2004031bacb2a31e30974e49e8bb1676e761b872966edb1b3f54f"} Jan 30 05:10:26 crc kubenswrapper[4931]: I0130 05:10:26.480048 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:26 crc kubenswrapper[4931]: I0130 05:10:26.484061 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:10:27 crc kubenswrapper[4931]: I0130 05:10:27.363771 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:10:27 crc kubenswrapper[4931]: I0130 05:10:27.364369 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:10:28 crc kubenswrapper[4931]: I0130 05:10:28.030731 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tbgzs" Jan 30 05:10:30 crc kubenswrapper[4931]: I0130 05:10:30.088131 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:10:30 crc kubenswrapper[4931]: I0130 05:10:30.096247 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1421762e-4873-46cb-8c43-b8faa0cbca62-metrics-certs\") pod \"network-metrics-daemon-gt48b\" (UID: \"1421762e-4873-46cb-8c43-b8faa0cbca62\") " pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:10:30 crc kubenswrapper[4931]: I0130 05:10:30.268316 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt48b" Jan 30 05:10:34 crc kubenswrapper[4931]: I0130 05:10:34.997873 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:10:36 crc kubenswrapper[4931]: I0130 05:10:36.959248 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.124367 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd46484-5f82-4786-a8a1-25484b70f820-kube-api-access\") pod \"dcd46484-5f82-4786-a8a1-25484b70f820\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.124496 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd46484-5f82-4786-a8a1-25484b70f820-kubelet-dir\") pod \"dcd46484-5f82-4786-a8a1-25484b70f820\" (UID: \"dcd46484-5f82-4786-a8a1-25484b70f820\") " Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.124708 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcd46484-5f82-4786-a8a1-25484b70f820-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dcd46484-5f82-4786-a8a1-25484b70f820" (UID: "dcd46484-5f82-4786-a8a1-25484b70f820"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.124874 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd46484-5f82-4786-a8a1-25484b70f820-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.130440 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd46484-5f82-4786-a8a1-25484b70f820-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dcd46484-5f82-4786-a8a1-25484b70f820" (UID: "dcd46484-5f82-4786-a8a1-25484b70f820"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.226317 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd46484-5f82-4786-a8a1-25484b70f820-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.419679 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dcd46484-5f82-4786-a8a1-25484b70f820","Type":"ContainerDied","Data":"7f8a5441fbe7c584c757919eb28cbefcf2c8e76103b55e4ac23031a1715dee4e"} Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.419728 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f8a5441fbe7c584c757919eb28cbefcf2c8e76103b55e4ac23031a1715dee4e" Jan 30 05:10:37 crc kubenswrapper[4931]: I0130 05:10:37.419746 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.066822 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.067938 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gdhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-z64mf_openshift-marketplace(bb356dde-8435-471d-a260-8966eeb15eb3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.072927 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-z64mf" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" Jan 30 05:10:45 crc 
kubenswrapper[4931]: E0130 05:10:45.173057 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.173213 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xn84w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-pfl6d_openshift-marketplace(72ab8593-3b5e-421a-ac80-b85376b21ffe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.174400 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-pfl6d" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.185598 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.185719 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8xcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7jp5s_openshift-marketplace(9ac0e0dc-4375-4faf-a262-2cf4e9772a29): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.187612 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7jp5s" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" Jan 30 05:10:45 crc 
kubenswrapper[4931]: E0130 05:10:45.225651 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.226251 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dnh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-56dq5_openshift-marketplace(7e9dd69f-1c2e-4b14-83f8-dff33fe2118d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.227713 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.227832 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zg6xd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:f
alse,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-k5fcn_openshift-marketplace(9163b44e-4aa5-422c-a2fd-55747c8d506e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.228948 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-k5fcn" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.232132 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-56dq5" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" Jan 30 05:10:45 crc kubenswrapper[4931]: I0130 05:10:45.323855 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gt48b"] Jan 30 05:10:45 crc kubenswrapper[4931]: W0130 05:10:45.331700 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1421762e_4873_46cb_8c43_b8faa0cbca62.slice/crio-af78854884472a71530d82d034faa6f6359f0652a628d54776f28d6a26b04995 WatchSource:0}: Error finding container af78854884472a71530d82d034faa6f6359f0652a628d54776f28d6a26b04995: Status 404 returned error can't find the container with id af78854884472a71530d82d034faa6f6359f0652a628d54776f28d6a26b04995 Jan 30 05:10:45 crc 
kubenswrapper[4931]: I0130 05:10:45.478824 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j7wh" event={"ID":"39e99a4f-8956-424c-a4c6-7a67f9983cd0","Type":"ContainerStarted","Data":"924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9"} Jan 30 05:10:45 crc kubenswrapper[4931]: I0130 05:10:45.484160 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gt48b" event={"ID":"1421762e-4873-46cb-8c43-b8faa0cbca62","Type":"ContainerStarted","Data":"af78854884472a71530d82d034faa6f6359f0652a628d54776f28d6a26b04995"} Jan 30 05:10:45 crc kubenswrapper[4931]: I0130 05:10:45.486702 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mltbk" event={"ID":"025f8209-dd2a-482c-8bb2-e0ad2a98a563","Type":"ContainerStarted","Data":"8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111"} Jan 30 05:10:45 crc kubenswrapper[4931]: I0130 05:10:45.488240 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnwj" event={"ID":"0dbdc3df-7306-41e4-93c6-d7d27d481789","Type":"ContainerStarted","Data":"a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8"} Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.493400 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-56dq5" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.493994 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-z64mf" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.494140 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-k5fcn" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.494843 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7jp5s" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" Jan 30 05:10:45 crc kubenswrapper[4931]: E0130 05:10:45.494961 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-pfl6d" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.497055 4931 generic.go:334] "Generic (PLEG): container finished" podID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerID="924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9" exitCode=0 Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.497151 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j7wh" event={"ID":"39e99a4f-8956-424c-a4c6-7a67f9983cd0","Type":"ContainerDied","Data":"924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9"} Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.503242 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-gt48b" event={"ID":"1421762e-4873-46cb-8c43-b8faa0cbca62","Type":"ContainerStarted","Data":"a598cff63bbc39cb13ce46e815b31cc5173cc8625f4cbbeedd2e4a6af3e83182"} Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.503300 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gt48b" event={"ID":"1421762e-4873-46cb-8c43-b8faa0cbca62","Type":"ContainerStarted","Data":"254617a263e25220ad4eb40ffafa564e067a606620a8cbddb9e3a5d832e1ee94"} Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.508194 4931 generic.go:334] "Generic (PLEG): container finished" podID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerID="8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111" exitCode=0 Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.508325 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mltbk" event={"ID":"025f8209-dd2a-482c-8bb2-e0ad2a98a563","Type":"ContainerDied","Data":"8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111"} Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.512656 4931 generic.go:334] "Generic (PLEG): container finished" podID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerID="a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8" exitCode=0 Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.512695 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnwj" event={"ID":"0dbdc3df-7306-41e4-93c6-d7d27d481789","Type":"ContainerDied","Data":"a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8"} Jan 30 05:10:46 crc kubenswrapper[4931]: I0130 05:10:46.553697 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gt48b" podStartSLOduration=159.553659824 podStartE2EDuration="2m39.553659824s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:46.547681507 +0000 UTC m=+181.917591794" watchObservedRunningTime="2026-01-30 05:10:46.553659824 +0000 UTC m=+181.923570091" Jan 30 05:10:47 crc kubenswrapper[4931]: I0130 05:10:47.521108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnwj" event={"ID":"0dbdc3df-7306-41e4-93c6-d7d27d481789","Type":"ContainerStarted","Data":"df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca"} Jan 30 05:10:47 crc kubenswrapper[4931]: I0130 05:10:47.523914 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j7wh" event={"ID":"39e99a4f-8956-424c-a4c6-7a67f9983cd0","Type":"ContainerStarted","Data":"8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534"} Jan 30 05:10:47 crc kubenswrapper[4931]: I0130 05:10:47.528616 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mltbk" event={"ID":"025f8209-dd2a-482c-8bb2-e0ad2a98a563","Type":"ContainerStarted","Data":"61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573"} Jan 30 05:10:47 crc kubenswrapper[4931]: I0130 05:10:47.545488 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frnwj" podStartSLOduration=2.457923278 podStartE2EDuration="33.545466139s" podCreationTimestamp="2026-01-30 05:10:14 +0000 UTC" firstStartedPulling="2026-01-30 05:10:16.027396054 +0000 UTC m=+151.397306311" lastFinishedPulling="2026-01-30 05:10:47.114938915 +0000 UTC m=+182.484849172" observedRunningTime="2026-01-30 05:10:47.541738031 +0000 UTC m=+182.911648338" watchObservedRunningTime="2026-01-30 05:10:47.545466139 +0000 UTC m=+182.915376406" Jan 30 05:10:47 crc kubenswrapper[4931]: I0130 05:10:47.570154 4931 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/community-operators-4j7wh" podStartSLOduration=2.605764374 podStartE2EDuration="33.570112717s" podCreationTimestamp="2026-01-30 05:10:14 +0000 UTC" firstStartedPulling="2026-01-30 05:10:16.031850861 +0000 UTC m=+151.401761118" lastFinishedPulling="2026-01-30 05:10:46.996199214 +0000 UTC m=+182.366109461" observedRunningTime="2026-01-30 05:10:47.569160192 +0000 UTC m=+182.939070459" watchObservedRunningTime="2026-01-30 05:10:47.570112717 +0000 UTC m=+182.940022984" Jan 30 05:10:47 crc kubenswrapper[4931]: I0130 05:10:47.607315 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mltbk" podStartSLOduration=3.837449271 podStartE2EDuration="31.607284933s" podCreationTimestamp="2026-01-30 05:10:16 +0000 UTC" firstStartedPulling="2026-01-30 05:10:19.185225881 +0000 UTC m=+154.555136138" lastFinishedPulling="2026-01-30 05:10:46.955061543 +0000 UTC m=+182.324971800" observedRunningTime="2026-01-30 05:10:47.600564997 +0000 UTC m=+182.970475264" watchObservedRunningTime="2026-01-30 05:10:47.607284933 +0000 UTC m=+182.977195190" Jan 30 05:10:48 crc kubenswrapper[4931]: I0130 05:10:48.504533 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mg25x" Jan 30 05:10:53 crc kubenswrapper[4931]: I0130 05:10:53.461470 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:10:54 crc kubenswrapper[4931]: I0130 05:10:54.945643 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:54 crc kubenswrapper[4931]: I0130 05:10:54.945718 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.070780 
4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.089481 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.146535 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.146722 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.154452 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 05:10:55 crc kubenswrapper[4931]: E0130 05:10:55.154739 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd46484-5f82-4786-a8a1-25484b70f820" containerName="pruner" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.154755 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd46484-5f82-4786-a8a1-25484b70f820" containerName="pruner" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.154856 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd46484-5f82-4786-a8a1-25484b70f820" containerName="pruner" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.155280 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.157253 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.158054 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.168408 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.260891 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.264809 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.366229 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.366326 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.366808 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.385019 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.499524 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.629489 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.631631 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.724381 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 05:10:55 crc kubenswrapper[4931]: I0130 05:10:55.950730 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ww4ml"] Jan 30 05:10:56 crc kubenswrapper[4931]: I0130 05:10:56.585505 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc","Type":"ContainerStarted","Data":"ad1c13e14a492a607796de98bd778ea69e3d65c16697f4c694aa005acd953049"} Jan 30 05:10:56 crc kubenswrapper[4931]: I0130 05:10:56.585841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc","Type":"ContainerStarted","Data":"3e88c70970ead77b66a99a43f05bacb873bb709d4d21867e011b9a24c1f4cf06"} Jan 30 05:10:56 crc kubenswrapper[4931]: I0130 05:10:56.603710 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.603689546 podStartE2EDuration="1.603689546s" podCreationTimestamp="2026-01-30 05:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:56.602137385 +0000 UTC m=+191.972047642" watchObservedRunningTime="2026-01-30 
05:10:56.603689546 +0000 UTC m=+191.973599803" Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.283774 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.283995 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.338243 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:57 crc kubenswrapper[4931]: E0130 05:10:57.359627 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod4e2d9da0_86e1_4a44_a714_6ca3d2d32edc.slice/crio-conmon-ad1c13e14a492a607796de98bd778ea69e3d65c16697f4c694aa005acd953049.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.363055 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.363122 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.393745 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4j7wh"] Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.593316 4931 
generic.go:334] "Generic (PLEG): container finished" podID="4e2d9da0-86e1-4a44-a714-6ca3d2d32edc" containerID="ad1c13e14a492a607796de98bd778ea69e3d65c16697f4c694aa005acd953049" exitCode=0 Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.593442 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc","Type":"ContainerDied","Data":"ad1c13e14a492a607796de98bd778ea69e3d65c16697f4c694aa005acd953049"} Jan 30 05:10:57 crc kubenswrapper[4931]: I0130 05:10:57.657613 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mltbk" Jan 30 05:10:58 crc kubenswrapper[4931]: I0130 05:10:58.599909 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4j7wh" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="registry-server" containerID="cri-o://8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534" gracePeriod=2 Jan 30 05:10:58 crc kubenswrapper[4931]: I0130 05:10:58.840546 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:58.992390 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.024497 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kube-api-access\") pod \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.024598 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kubelet-dir\") pod \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\" (UID: \"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc\") " Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.024734 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4e2d9da0-86e1-4a44-a714-6ca3d2d32edc" (UID: "4e2d9da0-86e1-4a44-a714-6ca3d2d32edc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.025316 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.032305 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4e2d9da0-86e1-4a44-a714-6ca3d2d32edc" (UID: "4e2d9da0-86e1-4a44-a714-6ca3d2d32edc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.126602 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-catalog-content\") pod \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.126835 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5brk\" (UniqueName: \"kubernetes.io/projected/39e99a4f-8956-424c-a4c6-7a67f9983cd0-kube-api-access-v5brk\") pod \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.126887 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-utilities\") pod \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\" (UID: \"39e99a4f-8956-424c-a4c6-7a67f9983cd0\") " Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.127137 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e2d9da0-86e1-4a44-a714-6ca3d2d32edc-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.127736 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-utilities" (OuterVolumeSpecName: "utilities") pod "39e99a4f-8956-424c-a4c6-7a67f9983cd0" (UID: "39e99a4f-8956-424c-a4c6-7a67f9983cd0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.132895 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e99a4f-8956-424c-a4c6-7a67f9983cd0-kube-api-access-v5brk" (OuterVolumeSpecName: "kube-api-access-v5brk") pod "39e99a4f-8956-424c-a4c6-7a67f9983cd0" (UID: "39e99a4f-8956-424c-a4c6-7a67f9983cd0"). InnerVolumeSpecName "kube-api-access-v5brk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.190291 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39e99a4f-8956-424c-a4c6-7a67f9983cd0" (UID: "39e99a4f-8956-424c-a4c6-7a67f9983cd0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.228162 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5brk\" (UniqueName: \"kubernetes.io/projected/39e99a4f-8956-424c-a4c6-7a67f9983cd0-kube-api-access-v5brk\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.228201 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.228213 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e99a4f-8956-424c-a4c6-7a67f9983cd0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.596732 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mltbk"] Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 
05:10:59.606859 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e2d9da0-86e1-4a44-a714-6ca3d2d32edc","Type":"ContainerDied","Data":"3e88c70970ead77b66a99a43f05bacb873bb709d4d21867e011b9a24c1f4cf06"} Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.606913 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e88c70970ead77b66a99a43f05bacb873bb709d4d21867e011b9a24c1f4cf06" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.607017 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.609556 4931 generic.go:334] "Generic (PLEG): container finished" podID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerID="8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534" exitCode=0 Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.609601 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j7wh" event={"ID":"39e99a4f-8956-424c-a4c6-7a67f9983cd0","Type":"ContainerDied","Data":"8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534"} Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.609620 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j7wh" event={"ID":"39e99a4f-8956-424c-a4c6-7a67f9983cd0","Type":"ContainerDied","Data":"450b4e5ad73b8f494223ffa7ed558e29541dc036fd17fb80d85509be86652339"} Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.609638 4931 scope.go:117] "RemoveContainer" containerID="8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.609735 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4j7wh" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.612891 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfl6d" event={"ID":"72ab8593-3b5e-421a-ac80-b85376b21ffe","Type":"ContainerStarted","Data":"128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e"} Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.615848 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dq5" event={"ID":"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d","Type":"ContainerStarted","Data":"6a25d2a944e2534b4ce44d83ba3e3a50d7e29e3ca3c70fb2a75b736f90d90dec"} Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.621590 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5fcn" event={"ID":"9163b44e-4aa5-422c-a2fd-55747c8d506e","Type":"ContainerStarted","Data":"70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6"} Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.629837 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4j7wh"] Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.633506 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4j7wh"] Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.637872 4931 scope.go:117] "RemoveContainer" containerID="924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.692111 4931 scope.go:117] "RemoveContainer" containerID="0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.807030 4931 scope.go:117] "RemoveContainer" containerID="8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534" Jan 30 05:10:59 crc kubenswrapper[4931]: E0130 05:10:59.808009 4931 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534\": container with ID starting with 8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534 not found: ID does not exist" containerID="8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.808070 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534"} err="failed to get container status \"8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534\": rpc error: code = NotFound desc = could not find container \"8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534\": container with ID starting with 8983535f739cee230963069e641cbe7d953fb87dff0ee7647e2b412a361bd534 not found: ID does not exist" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.808155 4931 scope.go:117] "RemoveContainer" containerID="924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9" Jan 30 05:10:59 crc kubenswrapper[4931]: E0130 05:10:59.808543 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9\": container with ID starting with 924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9 not found: ID does not exist" containerID="924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.808608 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9"} err="failed to get container status \"924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9\": rpc error: code = NotFound 
desc = could not find container \"924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9\": container with ID starting with 924d3fe053808cb7c25239d42d036e4a8bda434c0ff0f8a8ede24d716ee714c9 not found: ID does not exist" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.808644 4931 scope.go:117] "RemoveContainer" containerID="0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893" Jan 30 05:10:59 crc kubenswrapper[4931]: E0130 05:10:59.809044 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893\": container with ID starting with 0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893 not found: ID does not exist" containerID="0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893" Jan 30 05:10:59 crc kubenswrapper[4931]: I0130 05:10:59.809078 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893"} err="failed to get container status \"0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893\": rpc error: code = NotFound desc = could not find container \"0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893\": container with ID starting with 0ab90532d3015b014ea0416fbd3c969102b9cb3b660fd71f10ccc1460320a893 not found: ID does not exist" Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.630293 4931 generic.go:334] "Generic (PLEG): container finished" podID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerID="128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e" exitCode=0 Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.630328 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfl6d" 
event={"ID":"72ab8593-3b5e-421a-ac80-b85376b21ffe","Type":"ContainerDied","Data":"128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e"} Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.633750 4931 generic.go:334] "Generic (PLEG): container finished" podID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerID="6a25d2a944e2534b4ce44d83ba3e3a50d7e29e3ca3c70fb2a75b736f90d90dec" exitCode=0 Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.633804 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dq5" event={"ID":"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d","Type":"ContainerDied","Data":"6a25d2a944e2534b4ce44d83ba3e3a50d7e29e3ca3c70fb2a75b736f90d90dec"} Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.637137 4931 generic.go:334] "Generic (PLEG): container finished" podID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerID="70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6" exitCode=0 Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.637220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5fcn" event={"ID":"9163b44e-4aa5-422c-a2fd-55747c8d506e","Type":"ContainerDied","Data":"70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6"} Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.639534 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z64mf" event={"ID":"bb356dde-8435-471d-a260-8966eeb15eb3","Type":"ContainerStarted","Data":"6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06"} Jan 30 05:11:00 crc kubenswrapper[4931]: I0130 05:11:00.645485 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mltbk" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="registry-server" containerID="cri-o://61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573" gracePeriod=2 Jan 30 
05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.055694 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mltbk"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.092619 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-utilities\") pod \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") "
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.092683 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjhnn\" (UniqueName: \"kubernetes.io/projected/025f8209-dd2a-482c-8bb2-e0ad2a98a563-kube-api-access-xjhnn\") pod \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") "
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.092716 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-catalog-content\") pod \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\" (UID: \"025f8209-dd2a-482c-8bb2-e0ad2a98a563\") "
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.094625 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-utilities" (OuterVolumeSpecName: "utilities") pod "025f8209-dd2a-482c-8bb2-e0ad2a98a563" (UID: "025f8209-dd2a-482c-8bb2-e0ad2a98a563"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.101540 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025f8209-dd2a-482c-8bb2-e0ad2a98a563-kube-api-access-xjhnn" (OuterVolumeSpecName: "kube-api-access-xjhnn") pod "025f8209-dd2a-482c-8bb2-e0ad2a98a563" (UID: "025f8209-dd2a-482c-8bb2-e0ad2a98a563"). InnerVolumeSpecName "kube-api-access-xjhnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.129645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "025f8209-dd2a-482c-8bb2-e0ad2a98a563" (UID: "025f8209-dd2a-482c-8bb2-e0ad2a98a563"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.193510 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjhnn\" (UniqueName: \"kubernetes.io/projected/025f8209-dd2a-482c-8bb2-e0ad2a98a563-kube-api-access-xjhnn\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.193552 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.193565 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025f8209-dd2a-482c-8bb2-e0ad2a98a563-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.433403 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" path="/var/lib/kubelet/pods/39e99a4f-8956-424c-a4c6-7a67f9983cd0/volumes"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.655164 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfl6d" event={"ID":"72ab8593-3b5e-421a-ac80-b85376b21ffe","Type":"ContainerStarted","Data":"22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed"}
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.659983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5fcn" event={"ID":"9163b44e-4aa5-422c-a2fd-55747c8d506e","Type":"ContainerStarted","Data":"6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa"}
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.674336 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dq5" event={"ID":"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d","Type":"ContainerStarted","Data":"98f1a2d73272f9f16ee133a746115272f4fdf7cbb2973402df8b866824230ca2"}
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.678877 4931 generic.go:334] "Generic (PLEG): container finished" podID="bb356dde-8435-471d-a260-8966eeb15eb3" containerID="6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06" exitCode=0
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.678993 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z64mf" event={"ID":"bb356dde-8435-471d-a260-8966eeb15eb3","Type":"ContainerDied","Data":"6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06"}
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.690277 4931 generic.go:334] "Generic (PLEG): container finished" podID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerID="61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573" exitCode=0
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.690560 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mltbk"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.691625 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mltbk" event={"ID":"025f8209-dd2a-482c-8bb2-e0ad2a98a563","Type":"ContainerDied","Data":"61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573"}
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.691682 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mltbk" event={"ID":"025f8209-dd2a-482c-8bb2-e0ad2a98a563","Type":"ContainerDied","Data":"38ccccf46a0ae1cf8c452695abc11e45f7b2dbc3854f74a91d1dd3b017d08519"}
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.691719 4931 scope.go:117] "RemoveContainer" containerID="61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.693943 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pfl6d" podStartSLOduration=3.6294463500000003 podStartE2EDuration="47.693914661s" podCreationTimestamp="2026-01-30 05:10:14 +0000 UTC" firstStartedPulling="2026-01-30 05:10:17.048445826 +0000 UTC m=+152.418356083" lastFinishedPulling="2026-01-30 05:11:01.112914137 +0000 UTC m=+196.482824394" observedRunningTime="2026-01-30 05:11:01.685629354 +0000 UTC m=+197.055539651" watchObservedRunningTime="2026-01-30 05:11:01.693914661 +0000 UTC m=+197.063824918"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.696376 4931 generic.go:334] "Generic (PLEG): container finished" podID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerID="709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd" exitCode=0
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.696454 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jp5s" event={"ID":"9ac0e0dc-4375-4faf-a262-2cf4e9772a29","Type":"ContainerDied","Data":"709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd"}
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.729512 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k5fcn" podStartSLOduration=2.644228646 podStartE2EDuration="47.729478542s" podCreationTimestamp="2026-01-30 05:10:14 +0000 UTC" firstStartedPulling="2026-01-30 05:10:16.037483989 +0000 UTC m=+151.407394246" lastFinishedPulling="2026-01-30 05:11:01.122733885 +0000 UTC m=+196.492644142" observedRunningTime="2026-01-30 05:11:01.712928308 +0000 UTC m=+197.082838575" watchObservedRunningTime="2026-01-30 05:11:01.729478542 +0000 UTC m=+197.099388799"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.731310 4931 scope.go:117] "RemoveContainer" containerID="8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.757398 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56dq5" podStartSLOduration=3.925124056 podStartE2EDuration="44.757366074s" podCreationTimestamp="2026-01-30 05:10:17 +0000 UTC" firstStartedPulling="2026-01-30 05:10:20.240440682 +0000 UTC m=+155.610350939" lastFinishedPulling="2026-01-30 05:11:01.0726827 +0000 UTC m=+196.442592957" observedRunningTime="2026-01-30 05:11:01.751497619 +0000 UTC m=+197.121407906" watchObservedRunningTime="2026-01-30 05:11:01.757366074 +0000 UTC m=+197.127276331"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.766664 4931 scope.go:117] "RemoveContainer" containerID="95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.790297 4931 scope.go:117] "RemoveContainer" containerID="61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573"
Jan 30 05:11:01 crc kubenswrapper[4931]: E0130 05:11:01.790729 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573\": container with ID starting with 61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573 not found: ID does not exist" containerID="61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.790777 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573"} err="failed to get container status \"61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573\": rpc error: code = NotFound desc = could not find container \"61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573\": container with ID starting with 61e91ec5adf3b6d2e6c6bf351571b66e56de20029d70508816d60e2f718fa573 not found: ID does not exist"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.790808 4931 scope.go:117] "RemoveContainer" containerID="8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111"
Jan 30 05:11:01 crc kubenswrapper[4931]: E0130 05:11:01.793685 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111\": container with ID starting with 8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111 not found: ID does not exist" containerID="8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.793717 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111"} err="failed to get container status \"8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111\": rpc error: code = NotFound desc = could not find container \"8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111\": container with ID starting with 8077360b2f317c6b05d19deec3953c16d0d53ccc821e4ebd6c5f208a32b8e111 not found: ID does not exist"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.793733 4931 scope.go:117] "RemoveContainer" containerID="95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836"
Jan 30 05:11:01 crc kubenswrapper[4931]: E0130 05:11:01.794046 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836\": container with ID starting with 95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836 not found: ID does not exist" containerID="95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.794074 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836"} err="failed to get container status \"95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836\": rpc error: code = NotFound desc = could not find container \"95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836\": container with ID starting with 95a2f8458be6179195776e6c58dc301461d3257c1b29b85caedabd8aea064836 not found: ID does not exist"
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.801077 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mltbk"]
Jan 30 05:11:01 crc kubenswrapper[4931]: I0130 05:11:01.804647 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mltbk"]
Jan 30 05:11:02 crc kubenswrapper[4931]: I0130 05:11:02.703359 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jp5s" event={"ID":"9ac0e0dc-4375-4faf-a262-2cf4e9772a29","Type":"ContainerStarted","Data":"c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf"}
Jan 30 05:11:02 crc kubenswrapper[4931]: I0130 05:11:02.706474 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z64mf" event={"ID":"bb356dde-8435-471d-a260-8966eeb15eb3","Type":"ContainerStarted","Data":"be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2"}
Jan 30 05:11:02 crc kubenswrapper[4931]: I0130 05:11:02.729190 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7jp5s" podStartSLOduration=2.613166089 podStartE2EDuration="46.729171291s" podCreationTimestamp="2026-01-30 05:10:16 +0000 UTC" firstStartedPulling="2026-01-30 05:10:18.106706187 +0000 UTC m=+153.476616444" lastFinishedPulling="2026-01-30 05:11:02.222711389 +0000 UTC m=+197.592621646" observedRunningTime="2026-01-30 05:11:02.725828791 +0000 UTC m=+198.095739048" watchObservedRunningTime="2026-01-30 05:11:02.729171291 +0000 UTC m=+198.099081548"
Jan 30 05:11:02 crc kubenswrapper[4931]: I0130 05:11:02.747269 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z64mf" podStartSLOduration=2.8182336279999998 podStartE2EDuration="45.7472449s" podCreationTimestamp="2026-01-30 05:10:17 +0000 UTC" firstStartedPulling="2026-01-30 05:10:19.162842133 +0000 UTC m=+154.532752390" lastFinishedPulling="2026-01-30 05:11:02.091853405 +0000 UTC m=+197.461763662" observedRunningTime="2026-01-30 05:11:02.7448737 +0000 UTC m=+198.114783947" watchObservedRunningTime="2026-01-30 05:11:02.7472449 +0000 UTC m=+198.117155157"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.351692 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 30 05:11:03 crc kubenswrapper[4931]: E0130 05:11:03.352494 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="extract-content"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.352630 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="extract-content"
Jan 30 05:11:03 crc kubenswrapper[4931]: E0130 05:11:03.352694 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="registry-server"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.352761 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="registry-server"
Jan 30 05:11:03 crc kubenswrapper[4931]: E0130 05:11:03.352831 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2d9da0-86e1-4a44-a714-6ca3d2d32edc" containerName="pruner"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.352890 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2d9da0-86e1-4a44-a714-6ca3d2d32edc" containerName="pruner"
Jan 30 05:11:03 crc kubenswrapper[4931]: E0130 05:11:03.352954 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="extract-utilities"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.353027 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="extract-utilities"
Jan 30 05:11:03 crc kubenswrapper[4931]: E0130 05:11:03.353091 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="extract-content"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.353151 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="extract-content"
Jan 30 05:11:03 crc kubenswrapper[4931]: E0130 05:11:03.353211 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="extract-utilities"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.353286 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="extract-utilities"
Jan 30 05:11:03 crc kubenswrapper[4931]: E0130 05:11:03.353351 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="registry-server"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.353408 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="registry-server"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.353583 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" containerName="registry-server"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.353644 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2d9da0-86e1-4a44-a714-6ca3d2d32edc" containerName="pruner"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.353704 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e99a4f-8956-424c-a4c6-7a67f9983cd0" containerName="registry-server"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.354203 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.361591 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.363491 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.369845 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.430653 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025f8209-dd2a-482c-8bb2-e0ad2a98a563" path="/var/lib/kubelet/pods/025f8209-dd2a-482c-8bb2-e0ad2a98a563/volumes"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.529808 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.530149 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-var-lock\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.530337 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29b6db44-5b56-401a-bbce-c9e55735350f-kube-api-access\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.631851 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.631927 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-var-lock\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.631957 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29b6db44-5b56-401a-bbce-c9e55735350f-kube-api-access\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.632339 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.632377 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-var-lock\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.662242 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29b6db44-5b56-401a-bbce-c9e55735350f-kube-api-access\") pod \"installer-9-crc\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.678894 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:03 crc kubenswrapper[4931]: I0130 05:11:03.930903 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 30 05:11:04 crc kubenswrapper[4931]: I0130 05:11:04.723892 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29b6db44-5b56-401a-bbce-c9e55735350f","Type":"ContainerStarted","Data":"5262c0a3d9bad98410b15e5334a833765b519d07d9825f5243288324f99b437e"}
Jan 30 05:11:04 crc kubenswrapper[4931]: I0130 05:11:04.723977 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29b6db44-5b56-401a-bbce-c9e55735350f","Type":"ContainerStarted","Data":"b157ac78dbfd45487650da2507a964b9ef37da369d371e13ba3722a6cc6cbd9b"}
Jan 30 05:11:04 crc kubenswrapper[4931]: I0130 05:11:04.746505 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.746473344 podStartE2EDuration="1.746473344s" podCreationTimestamp="2026-01-30 05:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:11:04.744388921 +0000 UTC m=+200.114299218" watchObservedRunningTime="2026-01-30 05:11:04.746473344 +0000 UTC m=+200.116383641"
Jan 30 05:11:04 crc kubenswrapper[4931]: I0130 05:11:04.786443 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k5fcn"
Jan 30 05:11:04 crc kubenswrapper[4931]: I0130 05:11:04.786515 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k5fcn"
Jan 30 05:11:04 crc kubenswrapper[4931]: I0130 05:11:04.869230 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k5fcn"
Jan 30 05:11:05 crc kubenswrapper[4931]: I0130 05:11:05.293135 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pfl6d"
Jan 30 05:11:05 crc kubenswrapper[4931]: I0130 05:11:05.293649 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pfl6d"
Jan 30 05:11:05 crc kubenswrapper[4931]: I0130 05:11:05.354220 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pfl6d"
Jan 30 05:11:06 crc kubenswrapper[4931]: I0130 05:11:06.928108 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7jp5s"
Jan 30 05:11:06 crc kubenswrapper[4931]: I0130 05:11:06.928180 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7jp5s"
Jan 30 05:11:07 crc kubenswrapper[4931]: I0130 05:11:07.009284 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7jp5s"
Jan 30 05:11:07 crc kubenswrapper[4931]: I0130 05:11:07.795413 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7jp5s"
Jan 30 05:11:07 crc kubenswrapper[4931]: I0130 05:11:07.898263 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z64mf"
Jan 30 05:11:07 crc kubenswrapper[4931]: I0130 05:11:07.898657 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z64mf"
Jan 30 05:11:08 crc kubenswrapper[4931]: I0130 05:11:08.284096 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56dq5"
Jan 30 05:11:08 crc kubenswrapper[4931]: I0130 05:11:08.284298 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56dq5"
Jan 30 05:11:08 crc kubenswrapper[4931]: I0130 05:11:08.338008 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56dq5"
Jan 30 05:11:08 crc kubenswrapper[4931]: I0130 05:11:08.829893 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56dq5"
Jan 30 05:11:08 crc kubenswrapper[4931]: I0130 05:11:08.942042 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z64mf" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:11:08 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:11:08 crc kubenswrapper[4931]: >
Jan 30 05:11:10 crc kubenswrapper[4931]: I0130 05:11:09.999476 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56dq5"]
Jan 30 05:11:11 crc kubenswrapper[4931]: I0130 05:11:11.787480 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56dq5" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="registry-server" containerID="cri-o://98f1a2d73272f9f16ee133a746115272f4fdf7cbb2973402df8b866824230ca2" gracePeriod=2
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.826776 4931 generic.go:334] "Generic (PLEG): container finished" podID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerID="98f1a2d73272f9f16ee133a746115272f4fdf7cbb2973402df8b866824230ca2" exitCode=0
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.826852 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dq5" event={"ID":"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d","Type":"ContainerDied","Data":"98f1a2d73272f9f16ee133a746115272f4fdf7cbb2973402df8b866824230ca2"}
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.885099 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56dq5"
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.919861 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dnh5\" (UniqueName: \"kubernetes.io/projected/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-kube-api-access-2dnh5\") pod \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") "
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.920025 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-utilities\") pod \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") "
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.924706 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-utilities" (OuterVolumeSpecName: "utilities") pod "7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" (UID: "7e9dd69f-1c2e-4b14-83f8-dff33fe2118d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.926782 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-catalog-content\") pod \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\" (UID: \"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d\") "
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.928712 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:13 crc kubenswrapper[4931]: I0130 05:11:13.938747 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-kube-api-access-2dnh5" (OuterVolumeSpecName: "kube-api-access-2dnh5") pod "7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" (UID: "7e9dd69f-1c2e-4b14-83f8-dff33fe2118d"). InnerVolumeSpecName "kube-api-access-2dnh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.030371 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dnh5\" (UniqueName: \"kubernetes.io/projected/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-kube-api-access-2dnh5\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.107222 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" (UID: "7e9dd69f-1c2e-4b14-83f8-dff33fe2118d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.132590 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.841493 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dq5" event={"ID":"7e9dd69f-1c2e-4b14-83f8-dff33fe2118d","Type":"ContainerDied","Data":"54c769092679e1f74ddf76a97ce7a50ba4a68138b396c7ae26403fddbe513fc1"}
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.841681 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56dq5"
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.842236 4931 scope.go:117] "RemoveContainer" containerID="98f1a2d73272f9f16ee133a746115272f4fdf7cbb2973402df8b866824230ca2"
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.860489 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k5fcn"
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.880903 4931 scope.go:117] "RemoveContainer" containerID="6a25d2a944e2534b4ce44d83ba3e3a50d7e29e3ca3c70fb2a75b736f90d90dec"
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.905355 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56dq5"]
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.915266 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56dq5"]
Jan 30 05:11:14 crc kubenswrapper[4931]: I0130 05:11:14.923605 4931 scope.go:117] "RemoveContainer" containerID="c23bf00de71e269e3c5c3d32d2b7e2842aa494dd5f905c0feee3ad4799d5aa22"
Jan 30 05:11:15 crc kubenswrapper[4931]: I0130 05:11:15.360540 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pfl6d"
Jan 30 05:11:15 crc kubenswrapper[4931]: I0130 05:11:15.435856 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" path="/var/lib/kubelet/pods/7e9dd69f-1c2e-4b14-83f8-dff33fe2118d/volumes"
Jan 30 05:11:17 crc kubenswrapper[4931]: I0130 05:11:17.793748 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pfl6d"]
Jan 30 05:11:17 crc kubenswrapper[4931]: I0130 05:11:17.794411 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pfl6d" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="registry-server" containerID="cri-o://22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed" gracePeriod=2
Jan 30 05:11:17 crc kubenswrapper[4931]: I0130 05:11:17.961824 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z64mf"
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.046093 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z64mf"
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.233797 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfl6d"
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.299060 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-catalog-content\") pod \"72ab8593-3b5e-421a-ac80-b85376b21ffe\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") "
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.299191 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn84w\" (UniqueName: \"kubernetes.io/projected/72ab8593-3b5e-421a-ac80-b85376b21ffe-kube-api-access-xn84w\") pod \"72ab8593-3b5e-421a-ac80-b85376b21ffe\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") "
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.299230 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-utilities\") pod \"72ab8593-3b5e-421a-ac80-b85376b21ffe\" (UID: \"72ab8593-3b5e-421a-ac80-b85376b21ffe\") "
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.300585 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-utilities" (OuterVolumeSpecName: "utilities") pod "72ab8593-3b5e-421a-ac80-b85376b21ffe" (UID: "72ab8593-3b5e-421a-ac80-b85376b21ffe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.307078 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ab8593-3b5e-421a-ac80-b85376b21ffe-kube-api-access-xn84w" (OuterVolumeSpecName: "kube-api-access-xn84w") pod "72ab8593-3b5e-421a-ac80-b85376b21ffe" (UID: "72ab8593-3b5e-421a-ac80-b85376b21ffe"). InnerVolumeSpecName "kube-api-access-xn84w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.350843 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72ab8593-3b5e-421a-ac80-b85376b21ffe" (UID: "72ab8593-3b5e-421a-ac80-b85376b21ffe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.401584 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn84w\" (UniqueName: \"kubernetes.io/projected/72ab8593-3b5e-421a-ac80-b85376b21ffe-kube-api-access-xn84w\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.401631 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.401644 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ab8593-3b5e-421a-ac80-b85376b21ffe-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.881017 4931 generic.go:334] "Generic (PLEG): container finished" podID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerID="22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed" exitCode=0
Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.881139 4931 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-pfl6d" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.881231 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfl6d" event={"ID":"72ab8593-3b5e-421a-ac80-b85376b21ffe","Type":"ContainerDied","Data":"22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed"} Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.881368 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfl6d" event={"ID":"72ab8593-3b5e-421a-ac80-b85376b21ffe","Type":"ContainerDied","Data":"d2226ce036426e1065e325748a9372dea2501d4becb5598917d5e4c3d429e02b"} Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.881450 4931 scope.go:117] "RemoveContainer" containerID="22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.910549 4931 scope.go:117] "RemoveContainer" containerID="128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.936921 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pfl6d"] Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.944392 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pfl6d"] Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.963109 4931 scope.go:117] "RemoveContainer" containerID="bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.987487 4931 scope.go:117] "RemoveContainer" containerID="22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed" Jan 30 05:11:18 crc kubenswrapper[4931]: E0130 05:11:18.988325 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed\": container with ID starting with 22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed not found: ID does not exist" containerID="22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.988399 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed"} err="failed to get container status \"22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed\": rpc error: code = NotFound desc = could not find container \"22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed\": container with ID starting with 22e8d8be79fff3d0637bf7d16c30e3800da082978a27e4e29fb44d16521799ed not found: ID does not exist" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.988478 4931 scope.go:117] "RemoveContainer" containerID="128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e" Jan 30 05:11:18 crc kubenswrapper[4931]: E0130 05:11:18.990195 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e\": container with ID starting with 128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e not found: ID does not exist" containerID="128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.990243 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e"} err="failed to get container status \"128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e\": rpc error: code = NotFound desc = could not find container \"128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e\": container with ID 
starting with 128fe76504e70bb6392537003236cd2c16bd8f6afb75b2b9e7f6b3093e55a04e not found: ID does not exist" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.990275 4931 scope.go:117] "RemoveContainer" containerID="bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5" Jan 30 05:11:18 crc kubenswrapper[4931]: E0130 05:11:18.990842 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5\": container with ID starting with bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5 not found: ID does not exist" containerID="bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5" Jan 30 05:11:18 crc kubenswrapper[4931]: I0130 05:11:18.990881 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5"} err="failed to get container status \"bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5\": rpc error: code = NotFound desc = could not find container \"bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5\": container with ID starting with bd65a17d38c4081e883be3cbec7fcb60af6937440ed8104d6646abc57031b8d5 not found: ID does not exist" Jan 30 05:11:19 crc kubenswrapper[4931]: I0130 05:11:19.450595 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" path="/var/lib/kubelet/pods/72ab8593-3b5e-421a-ac80-b85376b21ffe/volumes" Jan 30 05:11:20 crc kubenswrapper[4931]: I0130 05:11:20.994142 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" podUID="45ceead9-96b4-4b3c-9fba-1288da84db97" containerName="oauth-openshift" containerID="cri-o://b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634" gracePeriod=15 Jan 30 05:11:21 crc 
kubenswrapper[4931]: I0130 05:11:21.444931 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558654 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-cliconfig\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558737 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-idp-0-file-data\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558794 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-dir\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558855 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-policies\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558916 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-router-certs\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: 
\"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558924 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558955 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-ocp-branding-template\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.558995 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvmhv\" (UniqueName: \"kubernetes.io/projected/45ceead9-96b4-4b3c-9fba-1288da84db97-kube-api-access-gvmhv\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559037 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-serving-cert\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559077 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-error\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: 
\"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559173 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-login\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559598 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-service-ca\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559659 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-provider-selection\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559719 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-trusted-ca-bundle\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559759 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-session\") pod \"45ceead9-96b4-4b3c-9fba-1288da84db97\" (UID: \"45ceead9-96b4-4b3c-9fba-1288da84db97\") " 
Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.559858 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.560169 4931 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.560196 4931 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.560437 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.560760 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.561528 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.566842 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.566725 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.567018 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.567406 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.567840 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.568889 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.569449 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ceead9-96b4-4b3c-9fba-1288da84db97-kube-api-access-gvmhv" (OuterVolumeSpecName: "kube-api-access-gvmhv") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "kube-api-access-gvmhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.571069 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.579381 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "45ceead9-96b4-4b3c-9fba-1288da84db97" (UID: "45ceead9-96b4-4b3c-9fba-1288da84db97"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.661845 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.661963 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.661986 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvmhv\" (UniqueName: \"kubernetes.io/projected/45ceead9-96b4-4b3c-9fba-1288da84db97-kube-api-access-gvmhv\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 
05:11:21.662049 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662072 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662095 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662113 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662135 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662153 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662173 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662192 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.662211 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45ceead9-96b4-4b3c-9fba-1288da84db97-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.905714 4931 generic.go:334] "Generic (PLEG): container finished" podID="45ceead9-96b4-4b3c-9fba-1288da84db97" containerID="b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634" exitCode=0 Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.905863 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.906387 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" event={"ID":"45ceead9-96b4-4b3c-9fba-1288da84db97","Type":"ContainerDied","Data":"b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634"} Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.909224 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ww4ml" event={"ID":"45ceead9-96b4-4b3c-9fba-1288da84db97","Type":"ContainerDied","Data":"58f7af397c08f51e1fad13d7c31e06e26340ff1e4667e88288913594a1b1daca"} Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.909320 4931 scope.go:117] "RemoveContainer" containerID="b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.941548 4931 scope.go:117] "RemoveContainer" containerID="b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634" Jan 30 05:11:21 crc kubenswrapper[4931]: E0130 05:11:21.942286 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634\": container with ID starting with b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634 not found: ID does not exist" containerID="b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634" Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.942776 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634"} err="failed to get container status \"b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634\": rpc error: code = NotFound desc = could not find container 
\"b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634\": container with ID starting with b43d6b866734f28abac79825b9d740f660929182d58b93c5b171a5e5dbe71634 not found: ID does not exist"
Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.961679    4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ww4ml"]
Jan 30 05:11:21 crc kubenswrapper[4931]: I0130 05:11:21.967835    4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ww4ml"]
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.567966    4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-cf4db658-jjpht"]
Jan 30 05:11:22 crc kubenswrapper[4931]: E0130 05:11:22.568380    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="registry-server"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568403    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="registry-server"
Jan 30 05:11:22 crc kubenswrapper[4931]: E0130 05:11:22.568459    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="extract-content"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568472    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="extract-content"
Jan 30 05:11:22 crc kubenswrapper[4931]: E0130 05:11:22.568493    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="extract-utilities"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568507    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="extract-utilities"
Jan 30 05:11:22 crc kubenswrapper[4931]: E0130 05:11:22.568528    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="registry-server"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568540    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="registry-server"
Jan 30 05:11:22 crc kubenswrapper[4931]: E0130 05:11:22.568561    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="extract-utilities"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568576    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="extract-utilities"
Jan 30 05:11:22 crc kubenswrapper[4931]: E0130 05:11:22.568602    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ceead9-96b4-4b3c-9fba-1288da84db97" containerName="oauth-openshift"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568614    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ceead9-96b4-4b3c-9fba-1288da84db97" containerName="oauth-openshift"
Jan 30 05:11:22 crc kubenswrapper[4931]: E0130 05:11:22.568634    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="extract-content"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568646    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="extract-content"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568820    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ab8593-3b5e-421a-ac80-b85376b21ffe" containerName="registry-server"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568845    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ceead9-96b4-4b3c-9fba-1288da84db97" containerName="oauth-openshift"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.568874    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9dd69f-1c2e-4b14-83f8-dff33fe2118d" containerName="registry-server"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.569591    4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.572336    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.583199    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.583454    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.583711    4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.583838    4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.584200    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.584478    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.584688    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.584874    4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.584962    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.585833    4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.590797    4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cf4db658-jjpht"]
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.591343    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.594511    4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.601561    4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.623104    4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.680876    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.681412    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-service-ca\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.681665    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-audit-policies\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.681844    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.682057    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-error\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.682333    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.682658    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-router-certs\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.682917    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.683194    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-session\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.683409    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-login\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.683845    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.684108    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.684395    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e83887db-501f-4612-95c5-9874573e6cc3-audit-dir\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.684610    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xzp\" (UniqueName: \"kubernetes.io/projected/e83887db-501f-4612-95c5-9874573e6cc3-kube-api-access-g7xzp\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786452    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-login\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786565    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786611    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786656    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e83887db-501f-4612-95c5-9874573e6cc3-audit-dir\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786699    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xzp\" (UniqueName: \"kubernetes.io/projected/e83887db-501f-4612-95c5-9874573e6cc3-kube-api-access-g7xzp\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786749    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786793    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-service-ca\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786829    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-audit-policies\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786861    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.786896    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-error\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.788132    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.788679    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-router-certs\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.788751    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.788811    4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-session\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.789049    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.789977    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-audit-policies\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.790751    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e83887db-501f-4612-95c5-9874573e6cc3-audit-dir\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.792331    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-service-ca\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.794858    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-login\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.796503    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.798612    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.801212    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-router-certs\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.803398    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.805955    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.813999    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-session\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.814826    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.816910    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e83887db-501f-4612-95c5-9874573e6cc3-v4-0-config-user-template-error\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.821396    4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xzp\" (UniqueName: \"kubernetes.io/projected/e83887db-501f-4612-95c5-9874573e6cc3-kube-api-access-g7xzp\") pod \"oauth-openshift-cf4db658-jjpht\" (UID: \"e83887db-501f-4612-95c5-9874573e6cc3\") " pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:22 crc kubenswrapper[4931]: I0130 05:11:22.935595    4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:23 crc kubenswrapper[4931]: I0130 05:11:23.213136    4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cf4db658-jjpht"]
Jan 30 05:11:23 crc kubenswrapper[4931]: I0130 05:11:23.438711    4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ceead9-96b4-4b3c-9fba-1288da84db97" path="/var/lib/kubelet/pods/45ceead9-96b4-4b3c-9fba-1288da84db97/volumes"
Jan 30 05:11:23 crc kubenswrapper[4931]: I0130 05:11:23.934035    4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cf4db658-jjpht" event={"ID":"e83887db-501f-4612-95c5-9874573e6cc3","Type":"ContainerStarted","Data":"63363c9fcfcece35928baa1bb7d981f575d30998580b5dc3d8e5d377e14ef296"}
Jan 30 05:11:23 crc kubenswrapper[4931]: I0130 05:11:23.934132    4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cf4db658-jjpht" event={"ID":"e83887db-501f-4612-95c5-9874573e6cc3","Type":"ContainerStarted","Data":"5af29856727027285565767cd30400008f764d6832bb8e9c4c49b6d43b6e12f2"}
Jan 30 05:11:23 crc kubenswrapper[4931]: I0130 05:11:23.934481    4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:23 crc kubenswrapper[4931]: I0130 05:11:23.963315    4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-cf4db658-jjpht" podStartSLOduration=28.963289058 podStartE2EDuration="28.963289058s" podCreationTimestamp="2026-01-30 05:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:11:23.962659949 +0000 UTC m=+219.332570296" watchObservedRunningTime="2026-01-30 05:11:23.963289058 +0000 UTC m=+219.333199325"
Jan 30 05:11:23 crc kubenswrapper[4931]: I0130 05:11:23.979706    4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-cf4db658-jjpht"
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.363823    4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.364413    4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.365179    4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.366078    4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.366202    4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96" gracePeriod=600
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.969105    4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96" exitCode=0
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.969231    4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96"}
Jan 30 05:11:27 crc kubenswrapper[4931]: I0130 05:11:27.969530    4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"595c92fb582df913939826c54f51177c31890e24d0ac56595342acf9749b06a1"}
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.143753    4931 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.145177    4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.145279    4931 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.145703    4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7" gracePeriod=15
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.145727    4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409" gracePeriod=15
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.145703    4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f" gracePeriod=15
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.145816    4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944" gracePeriod=15
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.145762    4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9" gracePeriod=15
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.148574    4931 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.148940    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.148965    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.148988    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.148998    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.149017    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149039    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.149069    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149079    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.149102    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149112    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.149130    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149141    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.149155    4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149168    4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149345    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149361    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149376    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149391    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149401    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.149415    4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.186695    4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.351782    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.351871    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.351897    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.351927    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.351951    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.351990    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.352010    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.352033    4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:42 crc
kubenswrapper[4931]: I0130 05:11:42.456941 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457397 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457488 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457440 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457558 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457596 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457646 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457674 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457732 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.457792 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.458317 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.458445 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.459006 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.459047 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.459079 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.459112 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: I0130 05:11:42.481121 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:42 crc kubenswrapper[4931]: W0130 05:11:42.501656 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-260d401199a59151ab333ad8bbddcad4345a2392b184ac8ca482b074c3016a0d WatchSource:0}: Error finding container 260d401199a59151ab333ad8bbddcad4345a2392b184ac8ca482b074c3016a0d: Status 404 returned error can't find the container with id 260d401199a59151ab333ad8bbddcad4345a2392b184ac8ca482b074c3016a0d Jan 30 05:11:42 crc kubenswrapper[4931]: E0130 05:11:42.505103 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.179:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f6a28e8089b33 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:11:42.504008499 +0000 UTC m=+237.873918766,LastTimestamp:2026-01-30 05:11:42.504008499 +0000 UTC m=+237.873918766,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.105564 4931 generic.go:334] "Generic (PLEG): container finished" podID="29b6db44-5b56-401a-bbce-c9e55735350f" containerID="5262c0a3d9bad98410b15e5334a833765b519d07d9825f5243288324f99b437e" exitCode=0 Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.105685 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29b6db44-5b56-401a-bbce-c9e55735350f","Type":"ContainerDied","Data":"5262c0a3d9bad98410b15e5334a833765b519d07d9825f5243288324f99b437e"} Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.106888 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.107492 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.107987 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.109725 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.111732 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.113051 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7" exitCode=0 Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.113098 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944" exitCode=0 Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.113115 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9" exitCode=0 Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.113136 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409" exitCode=2 Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.113268 4931 scope.go:117] "RemoveContainer" containerID="13ede20d216824faa7a2228b01dd1e9d71f80c906010c828d7193df2a55e5b64" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.115863 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31"} Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.115927 4931 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"260d401199a59151ab333ad8bbddcad4345a2392b184ac8ca482b074c3016a0d"} Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.117001 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.117525 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:43 crc kubenswrapper[4931]: I0130 05:11:43.118013 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.130960 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 05:11:44 crc kubenswrapper[4931]: E0130 05:11:44.140045 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.179:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f6a28e8089b33 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:11:42.504008499 +0000 UTC m=+237.873918766,LastTimestamp:2026-01-30 05:11:42.504008499 +0000 UTC m=+237.873918766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.494942 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.495776 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.495962 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.605265 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29b6db44-5b56-401a-bbce-c9e55735350f-kube-api-access\") pod \"29b6db44-5b56-401a-bbce-c9e55735350f\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.605402 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-var-lock\") pod \"29b6db44-5b56-401a-bbce-c9e55735350f\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.605461 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-kubelet-dir\") pod \"29b6db44-5b56-401a-bbce-c9e55735350f\" (UID: \"29b6db44-5b56-401a-bbce-c9e55735350f\") " Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.605895 4931 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29b6db44-5b56-401a-bbce-c9e55735350f" (UID: "29b6db44-5b56-401a-bbce-c9e55735350f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.605930 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-var-lock" (OuterVolumeSpecName: "var-lock") pod "29b6db44-5b56-401a-bbce-c9e55735350f" (UID: "29b6db44-5b56-401a-bbce-c9e55735350f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.612395 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b6db44-5b56-401a-bbce-c9e55735350f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29b6db44-5b56-401a-bbce-c9e55735350f" (UID: "29b6db44-5b56-401a-bbce-c9e55735350f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.707414 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29b6db44-5b56-401a-bbce-c9e55735350f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.707865 4931 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.707881 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29b6db44-5b56-401a-bbce-c9e55735350f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.750451 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.751376 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.752258 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.753202 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.754000 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.911116 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.911354 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.911470 
4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.911772 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.911786 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.911902 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.912142 4931 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.912183 4931 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:44 crc kubenswrapper[4931]: I0130 05:11:44.912201 4931 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.143511 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29b6db44-5b56-401a-bbce-c9e55735350f","Type":"ContainerDied","Data":"b157ac78dbfd45487650da2507a964b9ef37da369d371e13ba3722a6cc6cbd9b"} Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.143604 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b157ac78dbfd45487650da2507a964b9ef37da369d371e13ba3722a6cc6cbd9b" Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.143612 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.148898 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.150406 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f" exitCode=0
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.150551 4931 scope.go:117] "RemoveContainer" containerID="e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.150655 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.182528 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.182881 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.183185 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.183806 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.188373 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.188968 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.194148 4931 scope.go:117] "RemoveContainer" containerID="6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.222235 4931 scope.go:117] "RemoveContainer" containerID="d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.243645 4931 scope.go:117] "RemoveContainer" containerID="f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.268411 4931 scope.go:117] "RemoveContainer" containerID="9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.289704 4931 scope.go:117] "RemoveContainer" containerID="48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.318482 4931 scope.go:117] "RemoveContainer" containerID="e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.319013 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\": container with ID starting with e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7 not found: ID does not exist" containerID="e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.319080 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7"} err="failed to get container status \"e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\": rpc error: code = NotFound desc = could not find container \"e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7\": container with ID starting with e28329c62307fafa9b09943c159da3008d3f43384301bd491ed17162224c7ef7 not found: ID does not exist"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.319122 4931 scope.go:117] "RemoveContainer" containerID="6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.319674 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\": container with ID starting with 6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944 not found: ID does not exist" containerID="6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.319704 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944"} err="failed to get container status \"6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\": rpc error: code = NotFound desc = could not find container \"6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944\": container with ID starting with 6ee6cf7ac43650602c4ebd09f855d56194bb78d999d693e3022fabc1856f2944 not found: ID does not exist"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.319726 4931 scope.go:117] "RemoveContainer" containerID="d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.320013 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\": container with ID starting with d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9 not found: ID does not exist" containerID="d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.320047 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9"} err="failed to get container status \"d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\": rpc error: code = NotFound desc = could not find container \"d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9\": container with ID starting with d4be4117fa91b6e5b03c1eaf39913d3ab5e17bd5f03e348e073edbab64a4def9 not found: ID does not exist"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.320064 4931 scope.go:117] "RemoveContainer" containerID="f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.320688 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\": container with ID starting with f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409 not found: ID does not exist" containerID="f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.320711 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409"} err="failed to get container status \"f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\": rpc error: code = NotFound desc = could not find container \"f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409\": container with ID starting with f526a8c83c6810286f460546da2c82bf646dc9ce8399a0dc632d3a3c311e4409 not found: ID does not exist"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.320733 4931 scope.go:117] "RemoveContainer" containerID="9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.321089 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\": container with ID starting with 9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f not found: ID does not exist" containerID="9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.321125 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f"} err="failed to get container status \"9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\": rpc error: code = NotFound desc = could not find container \"9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f\": container with ID starting with 9630e3419badc64acf0a6412c00e5bda59081ec55bb617498afeb6f03213693f not found: ID does not exist"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.321149 4931 scope.go:117] "RemoveContainer" containerID="48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.321700 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\": container with ID starting with 48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0 not found: ID does not exist" containerID="48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.321764 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0"} err="failed to get container status \"48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\": rpc error: code = NotFound desc = could not find container \"48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0\": container with ID starting with 48216451deed22ce5ed0c8ff3066c96a5b30134cf1b04c1f2f4b9d616a3d2fb0 not found: ID does not exist"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.431605 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.432365 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.433006 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.438462 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.980917 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.981249 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.981757 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.981988 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.982658 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:45 crc kubenswrapper[4931]: I0130 05:11:45.982724 4931 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Jan 30 05:11:45 crc kubenswrapper[4931]: E0130 05:11:45.983054 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="200ms"
Jan 30 05:11:46 crc kubenswrapper[4931]: E0130 05:11:46.184616 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="400ms"
Jan 30 05:11:46 crc kubenswrapper[4931]: E0130 05:11:46.586684 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="800ms"
Jan 30 05:11:47 crc kubenswrapper[4931]: E0130 05:11:47.388072 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="1.6s"
Jan 30 05:11:48 crc kubenswrapper[4931]: E0130 05:11:48.989370 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="3.2s"
Jan 30 05:11:52 crc kubenswrapper[4931]: E0130 05:11:52.191171 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="6.4s"
Jan 30 05:11:54 crc kubenswrapper[4931]: E0130 05:11:54.142109 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.179:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f6a28e8089b33 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:11:42.504008499 +0000 UTC m=+237.873918766,LastTimestamp:2026-01-30 05:11:42.504008499 +0000 UTC m=+237.873918766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 30 05:11:54 crc kubenswrapper[4931]: I0130 05:11:54.421811 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:54 crc kubenswrapper[4931]: I0130 05:11:54.422911 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:54 crc kubenswrapper[4931]: I0130 05:11:54.423519 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:54 crc kubenswrapper[4931]: I0130 05:11:54.452193 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:11:54 crc kubenswrapper[4931]: I0130 05:11:54.452254 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:11:54 crc kubenswrapper[4931]: E0130 05:11:54.453078 4931 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:54 crc kubenswrapper[4931]: I0130 05:11:54.453813 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:54 crc kubenswrapper[4931]: W0130 05:11:54.492344 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-77fc260e49c0241d02705c8f7f91523b6d95325b47a95fb3e424c2c9656b0812 WatchSource:0}: Error finding container 77fc260e49c0241d02705c8f7f91523b6d95325b47a95fb3e424c2c9656b0812: Status 404 returned error can't find the container with id 77fc260e49c0241d02705c8f7f91523b6d95325b47a95fb3e424c2c9656b0812
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.229921 4931 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="eaf476f9566098a1a87403e025cdcd05160001f809e4cb9298ae59d5aa8b2ff1" exitCode=0
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.230041 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"eaf476f9566098a1a87403e025cdcd05160001f809e4cb9298ae59d5aa8b2ff1"}
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.230534 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77fc260e49c0241d02705c8f7f91523b6d95325b47a95fb3e424c2c9656b0812"}
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.231000 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.231026 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:11:55 crc kubenswrapper[4931]: E0130 05:11:55.231736 4931 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.231714 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.232514 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.434755 4931 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.435339 4931 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:55 crc kubenswrapper[4931]: I0130 05:11:55.435901 4931 status_manager.go:851] "Failed to get status for pod" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused"
Jan 30 05:11:56 crc kubenswrapper[4931]: I0130 05:11:56.242809 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Jan 30 05:11:56 crc kubenswrapper[4931]: I0130 05:11:56.243165 4931 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07" exitCode=1
Jan 30 05:11:56 crc kubenswrapper[4931]: I0130 05:11:56.243212 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07"}
Jan 30 05:11:56 crc kubenswrapper[4931]: I0130 05:11:56.243927 4931 scope.go:117] "RemoveContainer" containerID="1c8fdc468479530a4757f1b448a63857d56a301237824e629ebffeb32cc50f07"
Jan 30 05:11:56 crc kubenswrapper[4931]: I0130 05:11:56.246605 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d932b05539c781d68b8d49d05ebdf07debc04383eacefa6ca55f030bb477fc32"}
Jan 30 05:11:56 crc kubenswrapper[4931]: I0130 05:11:56.246630 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"10ca6f3e71babbe5496ce5b48f0b41a5430db77a42c693fa1e50ac3da13ff27b"}
Jan 30 05:11:56 crc kubenswrapper[4931]: I0130 05:11:56.246641 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f281df23b1d0c78f24af79910a223828a369ad98d97db182f94754867ac781d2"}
Jan 30 05:11:57 crc kubenswrapper[4931]: I0130 05:11:57.254721 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Jan 30 05:11:57 crc kubenswrapper[4931]: I0130 05:11:57.254863 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8121e5a67bd8dca4f25f67bd0f1fbc5baa8e67403010ca8bc1bfb2df11c4e424"}
Jan 30 05:11:57 crc kubenswrapper[4931]: I0130 05:11:57.258621 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"961c4f79c9ec6746d30d5d3103df941a0dcd65cbbc6f5cb13c234d98200a880c"}
Jan 30 05:11:57 crc kubenswrapper[4931]: I0130 05:11:57.258668 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"43de9f31da3eb1eb0f205fd151d19d7acf071584c8c06a8e003134dc019fe428"}
Jan 30 05:11:57 crc kubenswrapper[4931]: I0130 05:11:57.258895 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:57 crc kubenswrapper[4931]: I0130 05:11:57.259006 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:11:57 crc kubenswrapper[4931]: I0130 05:11:57.259050 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:11:59 crc kubenswrapper[4931]: I0130 05:11:59.454580 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:59 crc kubenswrapper[4931]: I0130 05:11:59.454952 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:11:59 crc kubenswrapper[4931]: I0130 05:11:59.469294 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:12:00 crc kubenswrapper[4931]: I0130 05:12:00.461297 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:12:00 crc kubenswrapper[4931]: I0130 05:12:00.470204 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:12:01 crc kubenswrapper[4931]: I0130 05:12:01.299079 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:12:02 crc kubenswrapper[4931]: I0130 05:12:02.268715 4931 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:12:02 crc kubenswrapper[4931]: I0130 05:12:02.305386 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:12:02 crc kubenswrapper[4931]: I0130 05:12:02.305456 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:12:02 crc kubenswrapper[4931]: I0130 05:12:02.309583 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:12:02 crc kubenswrapper[4931]: I0130 05:12:02.311922 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="596fb761-2061-43fb-bcf3-41e724e78d86"
Jan 30 05:12:03 crc kubenswrapper[4931]: I0130 05:12:03.313705 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:12:03 crc kubenswrapper[4931]: I0130 05:12:03.314243 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f92025d1-3392-4c42-802e-b549f0bf4e7f"
Jan 30 05:12:05 crc kubenswrapper[4931]: I0130 05:12:05.443466 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="596fb761-2061-43fb-bcf3-41e724e78d86"
Jan 30 05:12:11 crc kubenswrapper[4931]: I0130 05:12:11.420335 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:12:11 crc kubenswrapper[4931]: I0130 05:12:11.670754 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 30 05:12:12 crc kubenswrapper[4931]: I0130 05:12:12.382375 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 30 05:12:12 crc kubenswrapper[4931]: I0130 05:12:12.445093 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.289274 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.508117 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.815298 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.868310 4931 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.876144 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=31.876112647 podStartE2EDuration="31.876112647s" podCreationTimestamp="2026-01-30 05:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:02.090851541 +0000 UTC m=+257.460761828" watchObservedRunningTime="2026-01-30 05:12:13.876112647 +0000 UTC m=+269.246022944"
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.877410 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.877542 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.886124 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:12:13 crc kubenswrapper[4931]: I0130 05:12:13.912380 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.912345775 podStartE2EDuration="11.912345775s" podCreationTimestamp="2026-01-30 05:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:13.904674933 +0000 UTC m=+269.274585230" watchObservedRunningTime="2026-01-30 05:12:13.912345775 +0000 UTC m=+269.282256062"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.006519 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.191326 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.193789 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.270581 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.613588 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.671248 4931 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.740707 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.770416 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 30 05:12:14 crc kubenswrapper[4931]: I0130 05:12:14.842915 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.033943 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.042632 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.159235 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.324598 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.347219 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.390208 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.468663 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.714285 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.770489 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.781737 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.893017 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 30 05:12:15 crc kubenswrapper[4931]: I0130 05:12:15.901930 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.146006 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.275808 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.362809 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.369609 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.424208 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.719667 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.748621 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.777603 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.793595 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.807841 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.821234 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.893033 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 30 05:12:16 crc kubenswrapper[4931]: I0130 05:12:16.896403 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.149381 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.251647 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.283412 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.468066 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.500418 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.555910 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.608331 4931
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.610183 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.637325 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.655504 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.680471 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.734136 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 05:12:17 crc kubenswrapper[4931]: I0130 05:12:17.976878 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.002737 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.024369 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.107002 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.122898 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.168183 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.185498 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.340941 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.377016 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.377513 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.456042 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.461973 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.527805 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.601625 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.626382 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 
05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.647559 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.738261 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.822223 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.872915 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.902788 4931 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 05:12:18 crc kubenswrapper[4931]: I0130 05:12:18.974313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.024746 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.115784 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.169832 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.218660 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.225360 4931 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.306388 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.498466 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.498815 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.520525 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.569823 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.590995 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.641106 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.669694 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.689371 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.721821 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 30 05:12:19 crc 
kubenswrapper[4931]: I0130 05:12:19.743616 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.776603 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.793583 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.804376 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.846197 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.861281 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 30 05:12:19 crc kubenswrapper[4931]: I0130 05:12:19.958713 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.049264 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.062611 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.151237 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.190198 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.382379 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.389567 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.427504 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.428998 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.444677 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.493280 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.575079 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.577372 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.713969 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.714023 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.889740 4931 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 05:12:20 crc kubenswrapper[4931]: I0130 05:12:20.897321 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.151237 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.188220 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.251181 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.333721 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.460481 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.470683 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.478021 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.512479 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.585253 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 
05:12:21.596440 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.675240 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.700602 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.715032 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.751955 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.767016 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.797103 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.813678 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.846911 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.856830 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.885644 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.953511 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.954848 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.962177 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 05:12:21 crc kubenswrapper[4931]: I0130 05:12:21.982314 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.080394 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.095149 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.100190 4931 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.127081 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.130173 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.167206 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.224011 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.250377 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.266272 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.359693 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.500682 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.547816 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.661753 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.663042 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.842885 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.857258 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.932086 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" 
Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.955328 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 05:12:22 crc kubenswrapper[4931]: I0130 05:12:22.964284 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.069733 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.158965 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.165164 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.232746 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.330575 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.362255 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.398717 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.431109 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.483162 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.494926 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.669954 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.670448 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.705842 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.787131 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.849627 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.859998 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.922004 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.938193 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.939896 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 05:12:23 crc kubenswrapper[4931]: I0130 05:12:23.991350 4931 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.069957 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.076507 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.190002 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.206162 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.294535 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.413151 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.431697 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.503487 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.595025 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.640452 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 
05:12:24.726655 4931 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.783664 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.795172 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.846029 4931 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.846556 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31" gracePeriod=5 Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.865896 4931 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.911239 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.915507 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.923795 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 05:12:24 crc kubenswrapper[4931]: I0130 05:12:24.945245 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 05:12:25 crc kubenswrapper[4931]: 
I0130 05:12:25.001200 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.171691 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.207806 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.258805 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.289336 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.425688 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.612465 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.695786 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.761077 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.806714 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.835919 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" 
Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.876786 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.952814 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 30 05:12:25 crc kubenswrapper[4931]: I0130 05:12:25.979448 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.049447 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.094749 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.097661 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.178897 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.214560 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.328900 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.396146 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.407663 4931 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.471532 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.489932 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.535390 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.651896 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.703501 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.723025 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frnwj"] Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.723408 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frnwj" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="registry-server" containerID="cri-o://df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca" gracePeriod=30 Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.724411 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.745178 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5fcn"] Jan 30 05:12:26 crc 
kubenswrapper[4931]: I0130 05:12:26.745767 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k5fcn" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerName="registry-server" containerID="cri-o://6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa" gracePeriod=30 Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.760786 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-phq4q"] Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.761102 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" podUID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" containerName="marketplace-operator" containerID="cri-o://dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935" gracePeriod=30 Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.767364 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.769133 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jp5s"] Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.769772 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7jp5s" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="registry-server" containerID="cri-o://c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" gracePeriod=30 Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.775122 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z64mf"] Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.775563 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z64mf" 
podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="registry-server" containerID="cri-o://be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2" gracePeriod=30 Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.798037 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ng75v"] Jan 30 05:12:26 crc kubenswrapper[4931]: E0130 05:12:26.798354 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" containerName="installer" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.798372 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" containerName="installer" Jan 30 05:12:26 crc kubenswrapper[4931]: E0130 05:12:26.798481 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.798496 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.798672 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.798692 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b6db44-5b56-401a-bbce-c9e55735350f" containerName="installer" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.799481 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.874894 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.894093 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.900735 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.906040 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 05:12:26 crc kubenswrapper[4931]: E0130 05:12:26.928807 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf is running failed: container process not found" containerID="c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:12:26 crc kubenswrapper[4931]: E0130 05:12:26.929579 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf is running failed: container process not found" containerID="c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:12:26 crc kubenswrapper[4931]: E0130 05:12:26.930141 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf is 
running failed: container process not found" containerID="c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:12:26 crc kubenswrapper[4931]: E0130 05:12:26.930213 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-7jp5s" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="registry-server" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.936699 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx24p\" (UniqueName: \"kubernetes.io/projected/29014adb-d772-451f-b4bf-9fdb5d417d1e-kube-api-access-fx24p\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.936778 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29014adb-d772-451f-b4bf-9fdb5d417d1e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:26 crc kubenswrapper[4931]: I0130 05:12:26.936805 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29014adb-d772-451f-b4bf-9fdb5d417d1e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 
05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.023655 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.038471 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx24p\" (UniqueName: \"kubernetes.io/projected/29014adb-d772-451f-b4bf-9fdb5d417d1e-kube-api-access-fx24p\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.038535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29014adb-d772-451f-b4bf-9fdb5d417d1e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.038554 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29014adb-d772-451f-b4bf-9fdb5d417d1e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.043474 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29014adb-d772-451f-b4bf-9fdb5d417d1e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.047993 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29014adb-d772-451f-b4bf-9fdb5d417d1e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.059779 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.072365 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx24p\" (UniqueName: \"kubernetes.io/projected/29014adb-d772-451f-b4bf-9fdb5d417d1e-kube-api-access-fx24p\") pod \"marketplace-operator-79b997595-ng75v\" (UID: \"29014adb-d772-451f-b4bf-9fdb5d417d1e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.182670 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.189915 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.193349 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.221174 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.226286 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.234760 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.252036 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.266054 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.280682 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.299627 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.344534 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg6xd\" (UniqueName: \"kubernetes.io/projected/9163b44e-4aa5-422c-a2fd-55747c8d506e-kube-api-access-zg6xd\") pod \"9163b44e-4aa5-422c-a2fd-55747c8d506e\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.344666 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-utilities\") pod \"9163b44e-4aa5-422c-a2fd-55747c8d506e\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.344764 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-utilities\") 
pod \"0dbdc3df-7306-41e4-93c6-d7d27d481789\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.344806 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r28cq\" (UniqueName: \"kubernetes.io/projected/bc314d0c-da50-4607-93e1-5bece9c3b2b1-kube-api-access-r28cq\") pod \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.344856 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-catalog-content\") pod \"bb356dde-8435-471d-a260-8966eeb15eb3\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.344907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gdhb\" (UniqueName: \"kubernetes.io/projected/bb356dde-8435-471d-a260-8966eeb15eb3-kube-api-access-9gdhb\") pod \"bb356dde-8435-471d-a260-8966eeb15eb3\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.344959 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-catalog-content\") pod \"9163b44e-4aa5-422c-a2fd-55747c8d506e\" (UID: \"9163b44e-4aa5-422c-a2fd-55747c8d506e\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.345000 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-utilities\") pod \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.345032 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-catalog-content\") pod \"0dbdc3df-7306-41e4-93c6-d7d27d481789\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.345979 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-catalog-content\") pod \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.346028 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-trusted-ca\") pod \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.346094 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6cpz\" (UniqueName: \"kubernetes.io/projected/0dbdc3df-7306-41e4-93c6-d7d27d481789-kube-api-access-z6cpz\") pod \"0dbdc3df-7306-41e4-93c6-d7d27d481789\" (UID: \"0dbdc3df-7306-41e4-93c6-d7d27d481789\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.346119 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8xcp\" (UniqueName: \"kubernetes.io/projected/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-kube-api-access-d8xcp\") pod \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\" (UID: \"9ac0e0dc-4375-4faf-a262-2cf4e9772a29\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.346152 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-utilities\") pod 
\"bb356dde-8435-471d-a260-8966eeb15eb3\" (UID: \"bb356dde-8435-471d-a260-8966eeb15eb3\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.346185 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-operator-metrics\") pod \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\" (UID: \"bc314d0c-da50-4607-93e1-5bece9c3b2b1\") " Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.346294 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-utilities" (OuterVolumeSpecName: "utilities") pod "9ac0e0dc-4375-4faf-a262-2cf4e9772a29" (UID: "9ac0e0dc-4375-4faf-a262-2cf4e9772a29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.346545 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.347022 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-utilities" (OuterVolumeSpecName: "utilities") pod "0dbdc3df-7306-41e4-93c6-d7d27d481789" (UID: "0dbdc3df-7306-41e4-93c6-d7d27d481789"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.349054 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-utilities" (OuterVolumeSpecName: "utilities") pod "bb356dde-8435-471d-a260-8966eeb15eb3" (UID: "bb356dde-8435-471d-a260-8966eeb15eb3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.349557 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bc314d0c-da50-4607-93e1-5bece9c3b2b1" (UID: "bc314d0c-da50-4607-93e1-5bece9c3b2b1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.350698 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-utilities" (OuterVolumeSpecName: "utilities") pod "9163b44e-4aa5-422c-a2fd-55747c8d506e" (UID: "9163b44e-4aa5-422c-a2fd-55747c8d506e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.355348 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bc314d0c-da50-4607-93e1-5bece9c3b2b1" (UID: "bc314d0c-da50-4607-93e1-5bece9c3b2b1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.355387 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-kube-api-access-d8xcp" (OuterVolumeSpecName: "kube-api-access-d8xcp") pod "9ac0e0dc-4375-4faf-a262-2cf4e9772a29" (UID: "9ac0e0dc-4375-4faf-a262-2cf4e9772a29"). InnerVolumeSpecName "kube-api-access-d8xcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.355413 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9163b44e-4aa5-422c-a2fd-55747c8d506e-kube-api-access-zg6xd" (OuterVolumeSpecName: "kube-api-access-zg6xd") pod "9163b44e-4aa5-422c-a2fd-55747c8d506e" (UID: "9163b44e-4aa5-422c-a2fd-55747c8d506e"). InnerVolumeSpecName "kube-api-access-zg6xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.355473 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc314d0c-da50-4607-93e1-5bece9c3b2b1-kube-api-access-r28cq" (OuterVolumeSpecName: "kube-api-access-r28cq") pod "bc314d0c-da50-4607-93e1-5bece9c3b2b1" (UID: "bc314d0c-da50-4607-93e1-5bece9c3b2b1"). InnerVolumeSpecName "kube-api-access-r28cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.355496 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb356dde-8435-471d-a260-8966eeb15eb3-kube-api-access-9gdhb" (OuterVolumeSpecName: "kube-api-access-9gdhb") pod "bb356dde-8435-471d-a260-8966eeb15eb3" (UID: "bb356dde-8435-471d-a260-8966eeb15eb3"). InnerVolumeSpecName "kube-api-access-9gdhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.367012 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbdc3df-7306-41e4-93c6-d7d27d481789-kube-api-access-z6cpz" (OuterVolumeSpecName: "kube-api-access-z6cpz") pod "0dbdc3df-7306-41e4-93c6-d7d27d481789" (UID: "0dbdc3df-7306-41e4-93c6-d7d27d481789"). InnerVolumeSpecName "kube-api-access-z6cpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.384901 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ac0e0dc-4375-4faf-a262-2cf4e9772a29" (UID: "9ac0e0dc-4375-4faf-a262-2cf4e9772a29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.410121 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9163b44e-4aa5-422c-a2fd-55747c8d506e" (UID: "9163b44e-4aa5-422c-a2fd-55747c8d506e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.424852 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dbdc3df-7306-41e4-93c6-d7d27d481789" (UID: "0dbdc3df-7306-41e4-93c6-d7d27d481789"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447291 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447445 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r28cq\" (UniqueName: \"kubernetes.io/projected/bc314d0c-da50-4607-93e1-5bece9c3b2b1-kube-api-access-r28cq\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447521 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gdhb\" (UniqueName: \"kubernetes.io/projected/bb356dde-8435-471d-a260-8966eeb15eb3-kube-api-access-9gdhb\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447583 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447653 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447713 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbdc3df-7306-41e4-93c6-d7d27d481789-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447780 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc 
kubenswrapper[4931]: I0130 05:12:27.447838 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8xcp\" (UniqueName: \"kubernetes.io/projected/9ac0e0dc-4375-4faf-a262-2cf4e9772a29-kube-api-access-d8xcp\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447906 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6cpz\" (UniqueName: \"kubernetes.io/projected/0dbdc3df-7306-41e4-93c6-d7d27d481789-kube-api-access-z6cpz\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.447967 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.448024 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc314d0c-da50-4607-93e1-5bece9c3b2b1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.448082 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg6xd\" (UniqueName: \"kubernetes.io/projected/9163b44e-4aa5-422c-a2fd-55747c8d506e-kube-api-access-zg6xd\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.448146 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9163b44e-4aa5-422c-a2fd-55747c8d506e-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.462189 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.494011 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb356dde-8435-471d-a260-8966eeb15eb3" (UID: "bb356dde-8435-471d-a260-8966eeb15eb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.504123 4931 generic.go:334] "Generic (PLEG): container finished" podID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerID="df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca" exitCode=0 Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.504216 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnwj" event={"ID":"0dbdc3df-7306-41e4-93c6-d7d27d481789","Type":"ContainerDied","Data":"df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.504248 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frnwj" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.504405 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnwj" event={"ID":"0dbdc3df-7306-41e4-93c6-d7d27d481789","Type":"ContainerDied","Data":"8fc2bd9106d95cb2212067bb79c5743a637b67855826a61a2a9690fea3308441"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.504985 4931 scope.go:117] "RemoveContainer" containerID="df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.509879 4931 generic.go:334] "Generic (PLEG): container finished" podID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerID="6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa" exitCode=0 Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.509961 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5fcn" event={"ID":"9163b44e-4aa5-422c-a2fd-55747c8d506e","Type":"ContainerDied","Data":"6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.509983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5fcn" event={"ID":"9163b44e-4aa5-422c-a2fd-55747c8d506e","Type":"ContainerDied","Data":"072fd632153afb9e250c4d51854168ac5eaa4674e8d0bff4bbfe11fe55d97dbc"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.510159 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k5fcn" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.512318 4931 generic.go:334] "Generic (PLEG): container finished" podID="bb356dde-8435-471d-a260-8966eeb15eb3" containerID="be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2" exitCode=0 Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.512368 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z64mf" event={"ID":"bb356dde-8435-471d-a260-8966eeb15eb3","Type":"ContainerDied","Data":"be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.512386 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z64mf" event={"ID":"bb356dde-8435-471d-a260-8966eeb15eb3","Type":"ContainerDied","Data":"a6a9276eab6557cd642ac08c2583f1c3b08c9bbb62478c22c66b2f818922633b"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.512465 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z64mf" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.517148 4931 generic.go:334] "Generic (PLEG): container finished" podID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" containerID="dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935" exitCode=0 Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.517215 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" event={"ID":"bc314d0c-da50-4607-93e1-5bece9c3b2b1","Type":"ContainerDied","Data":"dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.517234 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" event={"ID":"bc314d0c-da50-4607-93e1-5bece9c3b2b1","Type":"ContainerDied","Data":"b8dffc3066e9941e3da7e55a7eddcae34aa88188f6b968755e41658b1568e4e5"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.517576 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-phq4q" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.527348 4931 generic.go:334] "Generic (PLEG): container finished" podID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerID="c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" exitCode=0 Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.527414 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jp5s" event={"ID":"9ac0e0dc-4375-4faf-a262-2cf4e9772a29","Type":"ContainerDied","Data":"c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.527462 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jp5s" event={"ID":"9ac0e0dc-4375-4faf-a262-2cf4e9772a29","Type":"ContainerDied","Data":"827a507dec87e3e9291f3f56b6d8162668e69da1d6e51e16d8c5431ea4ab1518"} Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.527584 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jp5s" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.528930 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frnwj"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.533773 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-frnwj"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.539521 4931 scope.go:117] "RemoveContainer" containerID="a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.556981 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-phq4q"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.558462 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb356dde-8435-471d-a260-8966eeb15eb3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.559897 4931 scope.go:117] "RemoveContainer" containerID="47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.560356 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-phq4q"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.579295 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z64mf"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.581653 4931 scope.go:117] "RemoveContainer" containerID="df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.582410 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca\": container with ID starting with df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca not found: ID does not exist" containerID="df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.582581 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca"} err="failed to get container status \"df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca\": rpc error: code = NotFound desc = could not find container \"df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca\": container with ID starting with df0b357181e3eedcf60e2088c82c16833b5fbf8f654133be047c4df773ca01ca not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.582662 4931 scope.go:117] "RemoveContainer" containerID="a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.583383 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8\": container with ID starting with a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8 not found: ID does not exist" containerID="a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.583464 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8"} err="failed to get container status \"a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8\": rpc error: code = NotFound desc = could not find container \"a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8\": container with ID 
starting with a50f7b6d3ede0641c074c41b51e3f0fffb9f73055f923a85d79534dc71c276d8 not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.583509 4931 scope.go:117] "RemoveContainer" containerID="47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.584057 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b\": container with ID starting with 47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b not found: ID does not exist" containerID="47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.584107 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b"} err="failed to get container status \"47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b\": rpc error: code = NotFound desc = could not find container \"47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b\": container with ID starting with 47721f40853566da291d4128aa27cfd8e9088a579539c3cf063e110cae98ba9b not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.584145 4931 scope.go:117] "RemoveContainer" containerID="6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.584495 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z64mf"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.587740 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jp5s"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.590677 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-7jp5s"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.593460 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5fcn"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.596489 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k5fcn"] Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.601686 4931 scope.go:117] "RemoveContainer" containerID="70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.615086 4931 scope.go:117] "RemoveContainer" containerID="395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.650289 4931 scope.go:117] "RemoveContainer" containerID="6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.651149 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa\": container with ID starting with 6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa not found: ID does not exist" containerID="6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.651230 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa"} err="failed to get container status \"6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa\": rpc error: code = NotFound desc = could not find container \"6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa\": container with ID starting with 6a7c490a1ca576cf6b2bf25d389b7adc58b74fbf8028b2f89a729132152547fa not found: ID does not exist" 
Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.651288 4931 scope.go:117] "RemoveContainer" containerID="70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.651908 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6\": container with ID starting with 70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6 not found: ID does not exist" containerID="70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.651961 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6"} err="failed to get container status \"70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6\": rpc error: code = NotFound desc = could not find container \"70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6\": container with ID starting with 70058c728a740db130f42fbd757863af66006cb2424ed6ee705528054a5e47b6 not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.651998 4931 scope.go:117] "RemoveContainer" containerID="395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.652591 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae\": container with ID starting with 395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae not found: ID does not exist" containerID="395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.652785 4931 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae"} err="failed to get container status \"395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae\": rpc error: code = NotFound desc = could not find container \"395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae\": container with ID starting with 395003f8081bcbed1e60f901d15e3e3273a10ea7d416310fe7a937a87b9ab0ae not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.652959 4931 scope.go:117] "RemoveContainer" containerID="be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.674991 4931 scope.go:117] "RemoveContainer" containerID="6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.696637 4931 scope.go:117] "RemoveContainer" containerID="27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.718902 4931 scope.go:117] "RemoveContainer" containerID="be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.720109 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2\": container with ID starting with be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2 not found: ID does not exist" containerID="be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.720148 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2"} err="failed to get container status \"be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2\": 
rpc error: code = NotFound desc = could not find container \"be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2\": container with ID starting with be84f384f34a985ca6074056a3f0a86f8302b573afd7c5d671ba2a6cd796e0a2 not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.720187 4931 scope.go:117] "RemoveContainer" containerID="6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.720573 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06\": container with ID starting with 6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06 not found: ID does not exist" containerID="6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.720611 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06"} err="failed to get container status \"6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06\": rpc error: code = NotFound desc = could not find container \"6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06\": container with ID starting with 6c297be527ffd29be1f24337f1122c053fca9ee7947104d32085a2d58afaed06 not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.720632 4931 scope.go:117] "RemoveContainer" containerID="27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.721037 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f\": container with ID starting with 
27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f not found: ID does not exist" containerID="27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.721087 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f"} err="failed to get container status \"27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f\": rpc error: code = NotFound desc = could not find container \"27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f\": container with ID starting with 27fb758631cfe852aecd73c3499352e9792cd3594cffa3ea4c324b21b40b055f not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.721119 4931 scope.go:117] "RemoveContainer" containerID="dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.742494 4931 scope.go:117] "RemoveContainer" containerID="dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.744092 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935\": container with ID starting with dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935 not found: ID does not exist" containerID="dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.744161 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935"} err="failed to get container status \"dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935\": rpc error: code = NotFound desc = could not find container 
\"dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935\": container with ID starting with dffe4917fe91f4a7ea81fe708baf5df13ef743b12684b18852f155d38db20935 not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.744190 4931 scope.go:117] "RemoveContainer" containerID="c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.762927 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.764801 4931 scope.go:117] "RemoveContainer" containerID="709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.786835 4931 scope.go:117] "RemoveContainer" containerID="ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.814079 4931 scope.go:117] "RemoveContainer" containerID="c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.822119 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf\": container with ID starting with c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf not found: ID does not exist" containerID="c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.823117 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf"} err="failed to get container status \"c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf\": rpc error: code = NotFound desc = could not find container 
\"c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf\": container with ID starting with c58dee2d75112faf5278471fa675a22e31796885008d0a213b4f2e18d60bfcdf not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.823247 4931 scope.go:117] "RemoveContainer" containerID="709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.824167 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd\": container with ID starting with 709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd not found: ID does not exist" containerID="709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.824235 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd"} err="failed to get container status \"709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd\": rpc error: code = NotFound desc = could not find container \"709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd\": container with ID starting with 709b951cf92c6bbce01c8ab7f77c6c23615d324ebd972a2d9fee7993dc0333dd not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.824279 4931 scope.go:117] "RemoveContainer" containerID="ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2" Jan 30 05:12:27 crc kubenswrapper[4931]: E0130 05:12:27.824801 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2\": container with ID starting with ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2 not found: ID does not exist" 
containerID="ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.824860 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2"} err="failed to get container status \"ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2\": rpc error: code = NotFound desc = could not find container \"ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2\": container with ID starting with ed987c960643926b76f2a7a57feffb55f8defd2210965e18745055f82bde89c2 not found: ID does not exist" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.841599 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.849443 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 05:12:27 crc kubenswrapper[4931]: I0130 05:12:27.869580 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 30 05:12:28 crc kubenswrapper[4931]: I0130 05:12:28.257514 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 30 05:12:28 crc kubenswrapper[4931]: I0130 05:12:28.335358 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ng75v"] Jan 30 05:12:28 crc kubenswrapper[4931]: I0130 05:12:28.644739 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 05:12:28 crc kubenswrapper[4931]: I0130 05:12:28.945551 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 05:12:28 crc 
kubenswrapper[4931]: I0130 05:12:28.992660 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 30 05:12:29 crc kubenswrapper[4931]: I0130 05:12:29.432634 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" path="/var/lib/kubelet/pods/0dbdc3df-7306-41e4-93c6-d7d27d481789/volumes" Jan 30 05:12:29 crc kubenswrapper[4931]: I0130 05:12:29.433547 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" path="/var/lib/kubelet/pods/9163b44e-4aa5-422c-a2fd-55747c8d506e/volumes" Jan 30 05:12:29 crc kubenswrapper[4931]: I0130 05:12:29.434240 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" path="/var/lib/kubelet/pods/9ac0e0dc-4375-4faf-a262-2cf4e9772a29/volumes" Jan 30 05:12:29 crc kubenswrapper[4931]: I0130 05:12:29.435749 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" path="/var/lib/kubelet/pods/bb356dde-8435-471d-a260-8966eeb15eb3/volumes" Jan 30 05:12:29 crc kubenswrapper[4931]: I0130 05:12:29.436658 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" path="/var/lib/kubelet/pods/bc314d0c-da50-4607-93e1-5bece9c3b2b1/volumes" Jan 30 05:12:29 crc kubenswrapper[4931]: I0130 05:12:29.676803 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.047125 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 05:12:30 crc kubenswrapper[4931]: E0130 05:12:30.124957 4931 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 30 05:12:30 crc kubenswrapper[4931]: rpc error: 
code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-ng75v_openshift-marketplace_29014adb-d772-451f-b4bf-9fdb5d417d1e_0(45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d): error adding pod openshift-marketplace_marketplace-operator-79b997595-ng75v to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d" Netns:"/var/run/netns/421ebf21-8cef-416e-9b55-b397c7c51bc2" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-ng75v;K8S_POD_INFRA_CONTAINER_ID=45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d;K8S_POD_UID=29014adb-d772-451f-b4bf-9fdb5d417d1e" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-ng75v] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-ng75v/29014adb-d772-451f-b4bf-9fdb5d417d1e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-ng75v in out of cluster comm: pod "marketplace-operator-79b997595-ng75v" not found Jan 30 05:12:30 crc kubenswrapper[4931]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:12:30 crc kubenswrapper[4931]: > Jan 30 05:12:30 crc kubenswrapper[4931]: E0130 05:12:30.125088 4931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 30 05:12:30 crc kubenswrapper[4931]: rpc error: code = Unknown desc = failed to 
create pod network sandbox k8s_marketplace-operator-79b997595-ng75v_openshift-marketplace_29014adb-d772-451f-b4bf-9fdb5d417d1e_0(45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d): error adding pod openshift-marketplace_marketplace-operator-79b997595-ng75v to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d" Netns:"/var/run/netns/421ebf21-8cef-416e-9b55-b397c7c51bc2" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-ng75v;K8S_POD_INFRA_CONTAINER_ID=45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d;K8S_POD_UID=29014adb-d772-451f-b4bf-9fdb5d417d1e" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-ng75v] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-ng75v/29014adb-d772-451f-b4bf-9fdb5d417d1e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-ng75v in out of cluster comm: pod "marketplace-operator-79b997595-ng75v" not found Jan 30 05:12:30 crc kubenswrapper[4931]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:12:30 crc kubenswrapper[4931]: > pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:30 crc kubenswrapper[4931]: E0130 05:12:30.125126 4931 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Jan 30 05:12:30 crc kubenswrapper[4931]: rpc 
error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-ng75v_openshift-marketplace_29014adb-d772-451f-b4bf-9fdb5d417d1e_0(45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d): error adding pod openshift-marketplace_marketplace-operator-79b997595-ng75v to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d" Netns:"/var/run/netns/421ebf21-8cef-416e-9b55-b397c7c51bc2" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-ng75v;K8S_POD_INFRA_CONTAINER_ID=45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d;K8S_POD_UID=29014adb-d772-451f-b4bf-9fdb5d417d1e" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-ng75v] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-ng75v/29014adb-d772-451f-b4bf-9fdb5d417d1e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-ng75v in out of cluster comm: pod "marketplace-operator-79b997595-ng75v" not found Jan 30 05:12:30 crc kubenswrapper[4931]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:12:30 crc kubenswrapper[4931]: > pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:30 crc kubenswrapper[4931]: E0130 05:12:30.125271 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"marketplace-operator-79b997595-ng75v_openshift-marketplace(29014adb-d772-451f-b4bf-9fdb5d417d1e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"marketplace-operator-79b997595-ng75v_openshift-marketplace(29014adb-d772-451f-b4bf-9fdb5d417d1e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-ng75v_openshift-marketplace_29014adb-d772-451f-b4bf-9fdb5d417d1e_0(45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d): error adding pod openshift-marketplace_marketplace-operator-79b997595-ng75v to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d\\\" Netns:\\\"/var/run/netns/421ebf21-8cef-416e-9b55-b397c7c51bc2\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-ng75v;K8S_POD_INFRA_CONTAINER_ID=45be532072b47c75c02d9d8ceedac9361372f18babde4e28708c3421d7f1b46d;K8S_POD_UID=29014adb-d772-451f-b4bf-9fdb5d417d1e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-ng75v] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-ng75v/29014adb-d772-451f-b4bf-9fdb5d417d1e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-ng75v in out of cluster comm: pod \\\"marketplace-operator-79b997595-ng75v\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" podUID="29014adb-d772-451f-b4bf-9fdb5d417d1e" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.300163 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.440992 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.441066 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.554543 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.555832 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.555882 4931 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31" exitCode=137 Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.555968 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.556588 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.556967 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.557115 4931 scope.go:117] "RemoveContainer" containerID="2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.579609 4931 scope.go:117] "RemoveContainer" containerID="2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31" Jan 30 05:12:30 crc kubenswrapper[4931]: E0130 05:12:30.580251 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31\": container with ID starting with 2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31 not found: ID does not exist" containerID="2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.580313 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31"} err="failed to get container status \"2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31\": rpc error: code = NotFound desc = could not find container \"2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31\": container with ID starting with 2d4ef79a860dedfaa63778b9eec997980772fe753dd08276c911199c82d54e31 not found: ID does not exist" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.598817 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.601668 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.610983 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611058 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611157 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611182 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611219 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611305 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611396 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611382 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611477 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611583 4931 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.611600 4931 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.625303 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.713672 4931 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.714349 4931 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.714362 4931 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.737486 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 05:12:30 crc 
kubenswrapper[4931]: I0130 05:12:30.775391 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ng75v"] Jan 30 05:12:30 crc kubenswrapper[4931]: I0130 05:12:30.795572 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.289618 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.430257 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.431088 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.446313 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.446351 4931 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8bbd6ff1-b870-4c5c-a24c-91b05d22f7bf" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.452896 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.452943 4931 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8bbd6ff1-b870-4c5c-a24c-91b05d22f7bf" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.566274 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" event={"ID":"29014adb-d772-451f-b4bf-9fdb5d417d1e","Type":"ContainerStarted","Data":"4eed271da89e3844d484a968ce5746fa8c0a3cc42efa472504fdfda70ee56b74"} Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.566360 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" event={"ID":"29014adb-d772-451f-b4bf-9fdb5d417d1e","Type":"ContainerStarted","Data":"1c7e8199aad72629af78cacd4130304244471f22d2b45dbb3626a50ad4e91fc9"} Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.567652 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.573505 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" Jan 30 05:12:31 crc kubenswrapper[4931]: I0130 05:12:31.590082 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ng75v" podStartSLOduration=5.590052116 podStartE2EDuration="5.590052116s" podCreationTimestamp="2026-01-30 05:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:31.585191516 +0000 UTC m=+286.955101813" watchObservedRunningTime="2026-01-30 05:12:31.590052116 +0000 UTC m=+286.959962373" Jan 30 05:12:32 crc kubenswrapper[4931]: I0130 05:12:32.239980 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 05:12:45 crc kubenswrapper[4931]: I0130 05:12:45.188133 4931 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 30 05:12:47 crc kubenswrapper[4931]: I0130 05:12:47.624765 4931 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsn4r"] Jan 30 05:12:47 crc kubenswrapper[4931]: I0130 05:12:47.625315 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" podUID="61a1f22c-baac-4356-9d01-ec2b51700b3a" containerName="controller-manager" containerID="cri-o://5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071" gracePeriod=30 Jan 30 05:12:47 crc kubenswrapper[4931]: I0130 05:12:47.724339 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"] Jan 30 05:12:47 crc kubenswrapper[4931]: I0130 05:12:47.724676 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" podUID="4fd326f4-63cb-4c1d-bb6c-98118a45f714" containerName="route-controller-manager" containerID="cri-o://8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491" gracePeriod=30 Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.059188 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.134310 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.178886 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-config\") pod \"61a1f22c-baac-4356-9d01-ec2b51700b3a\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.178946 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61a1f22c-baac-4356-9d01-ec2b51700b3a-serving-cert\") pod \"61a1f22c-baac-4356-9d01-ec2b51700b3a\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.178986 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-client-ca\") pod \"61a1f22c-baac-4356-9d01-ec2b51700b3a\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.179011 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2kc7\" (UniqueName: \"kubernetes.io/projected/61a1f22c-baac-4356-9d01-ec2b51700b3a-kube-api-access-b2kc7\") pod \"61a1f22c-baac-4356-9d01-ec2b51700b3a\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.179032 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-proxy-ca-bundles\") pod \"61a1f22c-baac-4356-9d01-ec2b51700b3a\" (UID: \"61a1f22c-baac-4356-9d01-ec2b51700b3a\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.179997 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "61a1f22c-baac-4356-9d01-ec2b51700b3a" (UID: "61a1f22c-baac-4356-9d01-ec2b51700b3a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.180062 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-client-ca" (OuterVolumeSpecName: "client-ca") pod "61a1f22c-baac-4356-9d01-ec2b51700b3a" (UID: "61a1f22c-baac-4356-9d01-ec2b51700b3a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.180097 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-config" (OuterVolumeSpecName: "config") pod "61a1f22c-baac-4356-9d01-ec2b51700b3a" (UID: "61a1f22c-baac-4356-9d01-ec2b51700b3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.186243 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a1f22c-baac-4356-9d01-ec2b51700b3a-kube-api-access-b2kc7" (OuterVolumeSpecName: "kube-api-access-b2kc7") pod "61a1f22c-baac-4356-9d01-ec2b51700b3a" (UID: "61a1f22c-baac-4356-9d01-ec2b51700b3a"). InnerVolumeSpecName "kube-api-access-b2kc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.187383 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a1f22c-baac-4356-9d01-ec2b51700b3a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "61a1f22c-baac-4356-9d01-ec2b51700b3a" (UID: "61a1f22c-baac-4356-9d01-ec2b51700b3a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.279850 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22qxm\" (UniqueName: \"kubernetes.io/projected/4fd326f4-63cb-4c1d-bb6c-98118a45f714-kube-api-access-22qxm\") pod \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.279984 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-config\") pod \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.280070 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-client-ca\") pod \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.280142 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd326f4-63cb-4c1d-bb6c-98118a45f714-serving-cert\") pod \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\" (UID: \"4fd326f4-63cb-4c1d-bb6c-98118a45f714\") " Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.280621 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.280651 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61a1f22c-baac-4356-9d01-ec2b51700b3a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc 
kubenswrapper[4931]: I0130 05:12:48.280668 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.280688 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2kc7\" (UniqueName: \"kubernetes.io/projected/61a1f22c-baac-4356-9d01-ec2b51700b3a-kube-api-access-b2kc7\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.280708 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61a1f22c-baac-4356-9d01-ec2b51700b3a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.280953 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-config" (OuterVolumeSpecName: "config") pod "4fd326f4-63cb-4c1d-bb6c-98118a45f714" (UID: "4fd326f4-63cb-4c1d-bb6c-98118a45f714"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.281112 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-client-ca" (OuterVolumeSpecName: "client-ca") pod "4fd326f4-63cb-4c1d-bb6c-98118a45f714" (UID: "4fd326f4-63cb-4c1d-bb6c-98118a45f714"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.283974 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd326f4-63cb-4c1d-bb6c-98118a45f714-kube-api-access-22qxm" (OuterVolumeSpecName: "kube-api-access-22qxm") pod "4fd326f4-63cb-4c1d-bb6c-98118a45f714" (UID: "4fd326f4-63cb-4c1d-bb6c-98118a45f714"). 
InnerVolumeSpecName "kube-api-access-22qxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.284696 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd326f4-63cb-4c1d-bb6c-98118a45f714-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4fd326f4-63cb-4c1d-bb6c-98118a45f714" (UID: "4fd326f4-63cb-4c1d-bb6c-98118a45f714"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.382447 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22qxm\" (UniqueName: \"kubernetes.io/projected/4fd326f4-63cb-4c1d-bb6c-98118a45f714-kube-api-access-22qxm\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.382504 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.382514 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd326f4-63cb-4c1d-bb6c-98118a45f714-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.382524 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd326f4-63cb-4c1d-bb6c-98118a45f714-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.428462 4931 generic.go:334] "Generic (PLEG): container finished" podID="61a1f22c-baac-4356-9d01-ec2b51700b3a" containerID="5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071" exitCode=0 Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.428543 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" event={"ID":"61a1f22c-baac-4356-9d01-ec2b51700b3a","Type":"ContainerDied","Data":"5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071"} Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.428577 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" event={"ID":"61a1f22c-baac-4356-9d01-ec2b51700b3a","Type":"ContainerDied","Data":"e5797b6657e3c9082bc25bca94daca7b60cb46a9c442bf1c2289963ba55e2ade"} Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.428579 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsn4r" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.428600 4931 scope.go:117] "RemoveContainer" containerID="5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.438765 4931 generic.go:334] "Generic (PLEG): container finished" podID="4fd326f4-63cb-4c1d-bb6c-98118a45f714" containerID="8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491" exitCode=0 Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.438830 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.438833 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" event={"ID":"4fd326f4-63cb-4c1d-bb6c-98118a45f714","Type":"ContainerDied","Data":"8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491"} Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.439050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4" event={"ID":"4fd326f4-63cb-4c1d-bb6c-98118a45f714","Type":"ContainerDied","Data":"833fcf0086ce5d914597f0c997c10afab54c09e9f589df3d6d360cb20264d686"} Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.469501 4931 scope.go:117] "RemoveContainer" containerID="5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071" Jan 30 05:12:48 crc kubenswrapper[4931]: E0130 05:12:48.470592 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071\": container with ID starting with 5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071 not found: ID does not exist" containerID="5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.470678 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071"} err="failed to get container status \"5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071\": rpc error: code = NotFound desc = could not find container \"5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071\": container with ID starting with 
5f95b1aab4f0cd9bb04c69533b5380769db0181d2dc9005aafa2ab56e08f0071 not found: ID does not exist" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.470751 4931 scope.go:117] "RemoveContainer" containerID="8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.478500 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsn4r"] Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.484076 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsn4r"] Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.491658 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"] Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.494627 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5zjn4"] Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.507773 4931 scope.go:117] "RemoveContainer" containerID="8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491" Jan 30 05:12:48 crc kubenswrapper[4931]: E0130 05:12:48.508399 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491\": container with ID starting with 8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491 not found: ID does not exist" containerID="8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491" Jan 30 05:12:48 crc kubenswrapper[4931]: I0130 05:12:48.508498 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491"} err="failed to get container status 
\"8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491\": rpc error: code = NotFound desc = could not find container \"8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491\": container with ID starting with 8284502c7c6af356e24dd93841b664f4e4ab158665d96da9e6e1f25234da0491 not found: ID does not exist" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.433296 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd326f4-63cb-4c1d-bb6c-98118a45f714" path="/var/lib/kubelet/pods/4fd326f4-63cb-4c1d-bb6c-98118a45f714/volumes" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.434805 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a1f22c-baac-4356-9d01-ec2b51700b3a" path="/var/lib/kubelet/pods/61a1f22c-baac-4356-9d01-ec2b51700b3a/volumes" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.620809 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5"] Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621153 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621167 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621183 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621190 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621199 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" 
containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621206 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621216 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621225 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621235 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621242 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621256 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621262 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="extract-content" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621270 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" containerName="marketplace-operator" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621276 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" containerName="marketplace-operator" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621287 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" 
containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621294 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621301 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621307 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621314 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a1f22c-baac-4356-9d01-ec2b51700b3a" containerName="controller-manager" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621321 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a1f22c-baac-4356-9d01-ec2b51700b3a" containerName="controller-manager" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621328 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621334 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621344 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621352 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621359 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" 
containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621364 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621372 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd326f4-63cb-4c1d-bb6c-98118a45f714" containerName="route-controller-manager" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621379 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd326f4-63cb-4c1d-bb6c-98118a45f714" containerName="route-controller-manager" Jan 30 05:12:49 crc kubenswrapper[4931]: E0130 05:12:49.621386 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621392 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerName="extract-utilities" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621538 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9163b44e-4aa5-422c-a2fd-55747c8d506e" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621547 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbdc3df-7306-41e4-93c6-d7d27d481789" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621562 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc314d0c-da50-4607-93e1-5bece9c3b2b1" containerName="marketplace-operator" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621570 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a1f22c-baac-4356-9d01-ec2b51700b3a" containerName="controller-manager" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621579 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4fd326f4-63cb-4c1d-bb6c-98118a45f714" containerName="route-controller-manager" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621587 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb356dde-8435-471d-a260-8966eeb15eb3" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.621595 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac0e0dc-4375-4faf-a262-2cf4e9772a29" containerName="registry-server" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.622110 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.623335 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-796947dbf8-vrtc2"] Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.623745 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.624121 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.625453 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629042 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629201 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629332 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629486 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629712 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629828 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629946 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.629953 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.630234 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 
05:12:49.630644 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.638747 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.645165 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-796947dbf8-vrtc2"] Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.651500 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5"] Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703163 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24kq\" (UniqueName: \"kubernetes.io/projected/22e7c3b5-dbbe-499e-84b0-b581db2401be-kube-api-access-q24kq\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx8n5\" (UniqueName: \"kubernetes.io/projected/6252d8d5-c05c-492d-adc0-37e03d1c8999-kube-api-access-zx8n5\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703254 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e7c3b5-dbbe-499e-84b0-b581db2401be-serving-cert\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: 
\"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703279 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-proxy-ca-bundles\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6252d8d5-c05c-492d-adc0-37e03d1c8999-config\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703317 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6252d8d5-c05c-492d-adc0-37e03d1c8999-serving-cert\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703348 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-config\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703373 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6252d8d5-c05c-492d-adc0-37e03d1c8999-client-ca\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.703392 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-client-ca\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.804933 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6252d8d5-c05c-492d-adc0-37e03d1c8999-serving-cert\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805011 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-config\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805041 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6252d8d5-c05c-492d-adc0-37e03d1c8999-client-ca\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " 
pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805060 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-client-ca\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805100 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q24kq\" (UniqueName: \"kubernetes.io/projected/22e7c3b5-dbbe-499e-84b0-b581db2401be-kube-api-access-q24kq\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805129 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx8n5\" (UniqueName: \"kubernetes.io/projected/6252d8d5-c05c-492d-adc0-37e03d1c8999-kube-api-access-zx8n5\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805156 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e7c3b5-dbbe-499e-84b0-b581db2401be-serving-cert\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805665 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-proxy-ca-bundles\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.805695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6252d8d5-c05c-492d-adc0-37e03d1c8999-config\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.806466 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-client-ca\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.806920 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6252d8d5-c05c-492d-adc0-37e03d1c8999-client-ca\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.807048 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-proxy-ca-bundles\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.807185 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6252d8d5-c05c-492d-adc0-37e03d1c8999-config\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.808618 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-config\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.811269 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e7c3b5-dbbe-499e-84b0-b581db2401be-serving-cert\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.814237 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6252d8d5-c05c-492d-adc0-37e03d1c8999-serving-cert\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.824265 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx8n5\" (UniqueName: \"kubernetes.io/projected/6252d8d5-c05c-492d-adc0-37e03d1c8999-kube-api-access-zx8n5\") pod \"route-controller-manager-59fc96bcb9-lbvj5\" (UID: \"6252d8d5-c05c-492d-adc0-37e03d1c8999\") " 
pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.825468 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24kq\" (UniqueName: \"kubernetes.io/projected/22e7c3b5-dbbe-499e-84b0-b581db2401be-kube-api-access-q24kq\") pod \"controller-manager-796947dbf8-vrtc2\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") " pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.950195 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:49 crc kubenswrapper[4931]: I0130 05:12:49.961113 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.207289 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-796947dbf8-vrtc2"] Jan 30 05:12:50 crc kubenswrapper[4931]: W0130 05:12:50.219695 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22e7c3b5_dbbe_499e_84b0_b581db2401be.slice/crio-8730f1dad4de33142daffe01ac15a9b43daa10f63aaa49338319e0fede7681d4 WatchSource:0}: Error finding container 8730f1dad4de33142daffe01ac15a9b43daa10f63aaa49338319e0fede7681d4: Status 404 returned error can't find the container with id 8730f1dad4de33142daffe01ac15a9b43daa10f63aaa49338319e0fede7681d4 Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.241972 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5"] Jan 30 05:12:50 crc kubenswrapper[4931]: W0130 05:12:50.286443 4931 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6252d8d5_c05c_492d_adc0_37e03d1c8999.slice/crio-e1ee5ddcd91c92063356ce95f477542c9860273020ffe09123142143907a303e WatchSource:0}: Error finding container e1ee5ddcd91c92063356ce95f477542c9860273020ffe09123142143907a303e: Status 404 returned error can't find the container with id e1ee5ddcd91c92063356ce95f477542c9860273020ffe09123142143907a303e Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.457264 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" event={"ID":"22e7c3b5-dbbe-499e-84b0-b581db2401be","Type":"ContainerStarted","Data":"334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051"} Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.457883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" event={"ID":"22e7c3b5-dbbe-499e-84b0-b581db2401be","Type":"ContainerStarted","Data":"8730f1dad4de33142daffe01ac15a9b43daa10f63aaa49338319e0fede7681d4"} Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.457915 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.460185 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" event={"ID":"6252d8d5-c05c-492d-adc0-37e03d1c8999","Type":"ContainerStarted","Data":"be7c814ba8fd871965485b4f6cee86d7878e1312bde78e615d3cbcc32e172174"} Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.460239 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" 
event={"ID":"6252d8d5-c05c-492d-adc0-37e03d1c8999","Type":"ContainerStarted","Data":"e1ee5ddcd91c92063356ce95f477542c9860273020ffe09123142143907a303e"} Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.460438 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.461707 4931 patch_prober.go:28] interesting pod/route-controller-manager-59fc96bcb9-lbvj5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.461860 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" podUID="6252d8d5-c05c-492d-adc0-37e03d1c8999" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.467495 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" Jan 30 05:12:50 crc kubenswrapper[4931]: I0130 05:12:50.493655 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" podStartSLOduration=3.493631136 podStartE2EDuration="3.493631136s" podCreationTimestamp="2026-01-30 05:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:50.491734951 +0000 UTC m=+305.861645208" watchObservedRunningTime="2026-01-30 05:12:50.493631136 +0000 UTC m=+305.863541393" Jan 30 05:12:50 crc 
kubenswrapper[4931]: I0130 05:12:50.564998 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" podStartSLOduration=3.564974698 podStartE2EDuration="3.564974698s" podCreationTimestamp="2026-01-30 05:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:50.560463008 +0000 UTC m=+305.930373285" watchObservedRunningTime="2026-01-30 05:12:50.564974698 +0000 UTC m=+305.934884965" Jan 30 05:12:51 crc kubenswrapper[4931]: I0130 05:12:51.470826 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59fc96bcb9-lbvj5" Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.708964 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-629s4"] Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.711086 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.722600 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-629s4"]
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.884512 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5aa81d9-e89e-4958-b823-73da6250ba31-registry-certificates\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.884707 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5aa81d9-e89e-4958-b823-73da6250ba31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.884752 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-bound-sa-token\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.884797 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn6rg\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-kube-api-access-mn6rg\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.884939 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-registry-tls\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.885038 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5aa81d9-e89e-4958-b823-73da6250ba31-trusted-ca\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.885063 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5aa81d9-e89e-4958-b823-73da6250ba31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.885198 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.908029 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.987018 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5aa81d9-e89e-4958-b823-73da6250ba31-registry-certificates\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.987178 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5aa81d9-e89e-4958-b823-73da6250ba31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.987226 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-bound-sa-token\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.987275 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn6rg\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-kube-api-access-mn6rg\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.987331 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-registry-tls\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.987380 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5aa81d9-e89e-4958-b823-73da6250ba31-trusted-ca\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.987838 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5aa81d9-e89e-4958-b823-73da6250ba31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.990410 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5aa81d9-e89e-4958-b823-73da6250ba31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.988404 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5aa81d9-e89e-4958-b823-73da6250ba31-registry-certificates\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.990322 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5aa81d9-e89e-4958-b823-73da6250ba31-trusted-ca\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.995467 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-registry-tls\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:02 crc kubenswrapper[4931]: I0130 05:13:02.995591 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5aa81d9-e89e-4958-b823-73da6250ba31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:03 crc kubenswrapper[4931]: I0130 05:13:03.008692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-bound-sa-token\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:03 crc kubenswrapper[4931]: I0130 05:13:03.009301 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn6rg\" (UniqueName: \"kubernetes.io/projected/f5aa81d9-e89e-4958-b823-73da6250ba31-kube-api-access-mn6rg\") pod \"image-registry-66df7c8f76-629s4\" (UID: \"f5aa81d9-e89e-4958-b823-73da6250ba31\") " pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:03 crc kubenswrapper[4931]: I0130 05:13:03.050077 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:03 crc kubenswrapper[4931]: I0130 05:13:03.546400 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-629s4"]
Jan 30 05:13:03 crc kubenswrapper[4931]: W0130 05:13:03.553996 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5aa81d9_e89e_4958_b823_73da6250ba31.slice/crio-d349d35cc57e9d1fc28c678fba86d94ef517778f251e24bc650cd8a859591d76 WatchSource:0}: Error finding container d349d35cc57e9d1fc28c678fba86d94ef517778f251e24bc650cd8a859591d76: Status 404 returned error can't find the container with id d349d35cc57e9d1fc28c678fba86d94ef517778f251e24bc650cd8a859591d76
Jan 30 05:13:04 crc kubenswrapper[4931]: I0130 05:13:04.564706 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-629s4" event={"ID":"f5aa81d9-e89e-4958-b823-73da6250ba31","Type":"ContainerStarted","Data":"cbd93577eba7683d1f1840cf871ba2bfc25b76cfa0a06797054e30eed27e259a"}
Jan 30 05:13:04 crc kubenswrapper[4931]: I0130 05:13:04.565205 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:04 crc kubenswrapper[4931]: I0130 05:13:04.565224 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-629s4" event={"ID":"f5aa81d9-e89e-4958-b823-73da6250ba31","Type":"ContainerStarted","Data":"d349d35cc57e9d1fc28c678fba86d94ef517778f251e24bc650cd8a859591d76"}
Jan 30 05:13:04 crc kubenswrapper[4931]: I0130 05:13:04.613615 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-629s4" podStartSLOduration=2.613595025 podStartE2EDuration="2.613595025s" podCreationTimestamp="2026-01-30 05:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:13:04.608208769 +0000 UTC m=+319.978119036" watchObservedRunningTime="2026-01-30 05:13:04.613595025 +0000 UTC m=+319.983505282"
Jan 30 05:13:23 crc kubenswrapper[4931]: I0130 05:13:23.057650 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-629s4"
Jan 30 05:13:23 crc kubenswrapper[4931]: I0130 05:13:23.192840 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f8zg7"]
Jan 30 05:13:27 crc kubenswrapper[4931]: I0130 05:13:27.363591 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:13:27 crc kubenswrapper[4931]: I0130 05:13:27.365750 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:13:27 crc kubenswrapper[4931]: I0130 05:13:27.651654 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-796947dbf8-vrtc2"]
Jan 30 05:13:27 crc kubenswrapper[4931]: I0130 05:13:27.652681 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" podUID="22e7c3b5-dbbe-499e-84b0-b581db2401be" containerName="controller-manager" containerID="cri-o://334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051" gracePeriod=30
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.063298 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2"
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.170281 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-client-ca\") pod \"22e7c3b5-dbbe-499e-84b0-b581db2401be\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") "
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.170370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-config\") pod \"22e7c3b5-dbbe-499e-84b0-b581db2401be\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") "
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.170479 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-proxy-ca-bundles\") pod \"22e7c3b5-dbbe-499e-84b0-b581db2401be\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") "
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.170536 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q24kq\" (UniqueName: \"kubernetes.io/projected/22e7c3b5-dbbe-499e-84b0-b581db2401be-kube-api-access-q24kq\") pod \"22e7c3b5-dbbe-499e-84b0-b581db2401be\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") "
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.170571 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e7c3b5-dbbe-499e-84b0-b581db2401be-serving-cert\") pod \"22e7c3b5-dbbe-499e-84b0-b581db2401be\" (UID: \"22e7c3b5-dbbe-499e-84b0-b581db2401be\") "
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.172500 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-client-ca" (OuterVolumeSpecName: "client-ca") pod "22e7c3b5-dbbe-499e-84b0-b581db2401be" (UID: "22e7c3b5-dbbe-499e-84b0-b581db2401be"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.172777 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "22e7c3b5-dbbe-499e-84b0-b581db2401be" (UID: "22e7c3b5-dbbe-499e-84b0-b581db2401be"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.173743 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-config" (OuterVolumeSpecName: "config") pod "22e7c3b5-dbbe-499e-84b0-b581db2401be" (UID: "22e7c3b5-dbbe-499e-84b0-b581db2401be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.180930 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22e7c3b5-dbbe-499e-84b0-b581db2401be-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "22e7c3b5-dbbe-499e-84b0-b581db2401be" (UID: "22e7c3b5-dbbe-499e-84b0-b581db2401be"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.180971 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e7c3b5-dbbe-499e-84b0-b581db2401be-kube-api-access-q24kq" (OuterVolumeSpecName: "kube-api-access-q24kq") pod "22e7c3b5-dbbe-499e-84b0-b581db2401be" (UID: "22e7c3b5-dbbe-499e-84b0-b581db2401be"). InnerVolumeSpecName "kube-api-access-q24kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.272700 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.272820 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q24kq\" (UniqueName: \"kubernetes.io/projected/22e7c3b5-dbbe-499e-84b0-b581db2401be-kube-api-access-q24kq\") on node \"crc\" DevicePath \"\""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.272851 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e7c3b5-dbbe-499e-84b0-b581db2401be-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.272869 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.272887 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e7c3b5-dbbe-499e-84b0-b581db2401be-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.748084 4931 generic.go:334] "Generic (PLEG): container finished" podID="22e7c3b5-dbbe-499e-84b0-b581db2401be" containerID="334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051" exitCode=0
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.748154 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" event={"ID":"22e7c3b5-dbbe-499e-84b0-b581db2401be","Type":"ContainerDied","Data":"334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051"}
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.748188 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2"
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.748215 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796947dbf8-vrtc2" event={"ID":"22e7c3b5-dbbe-499e-84b0-b581db2401be","Type":"ContainerDied","Data":"8730f1dad4de33142daffe01ac15a9b43daa10f63aaa49338319e0fede7681d4"}
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.748249 4931 scope.go:117] "RemoveContainer" containerID="334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051"
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.777540 4931 scope.go:117] "RemoveContainer" containerID="334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051"
Jan 30 05:13:28 crc kubenswrapper[4931]: E0130 05:13:28.778205 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051\": container with ID starting with 334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051 not found: ID does not exist" containerID="334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051"
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.778300 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051"} err="failed to get container status \"334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051\": rpc error: code = NotFound desc = could not find container \"334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051\": container with ID starting with 334ca40b7e5247624f4b661300f84d8d55aa7741bb4085f58747cd976544f051 not found: ID does not exist"
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.809024 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-796947dbf8-vrtc2"]
Jan 30 05:13:28 crc kubenswrapper[4931]: I0130 05:13:28.815100 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-796947dbf8-vrtc2"]
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.435516 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e7c3b5-dbbe-499e-84b0-b581db2401be" path="/var/lib/kubelet/pods/22e7c3b5-dbbe-499e-84b0-b581db2401be/volumes"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.679065 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67db4dc676-9s7v8"]
Jan 30 05:13:29 crc kubenswrapper[4931]: E0130 05:13:29.679923 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e7c3b5-dbbe-499e-84b0-b581db2401be" containerName="controller-manager"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.679948 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e7c3b5-dbbe-499e-84b0-b581db2401be" containerName="controller-manager"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.680150 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="22e7c3b5-dbbe-499e-84b0-b581db2401be" containerName="controller-manager"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.680760 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.685686 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.686037 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.686302 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.686730 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.687017 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.695326 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.699104 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.715756 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67db4dc676-9s7v8"]
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.799077 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-client-ca\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.799467 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dde9b145-fcfe-4d25-81bf-9eeb73805640-serving-cert\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.799562 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-proxy-ca-bundles\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.799701 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-config\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.799819 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22n2\" (UniqueName: \"kubernetes.io/projected/dde9b145-fcfe-4d25-81bf-9eeb73805640-kube-api-access-s22n2\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.901883 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-config\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.902142 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s22n2\" (UniqueName: \"kubernetes.io/projected/dde9b145-fcfe-4d25-81bf-9eeb73805640-kube-api-access-s22n2\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.902201 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-client-ca\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.902260 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dde9b145-fcfe-4d25-81bf-9eeb73805640-serving-cert\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.903352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-client-ca\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.903816 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-proxy-ca-bundles\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.904701 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-proxy-ca-bundles\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.905459 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde9b145-fcfe-4d25-81bf-9eeb73805640-config\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.916316 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dde9b145-fcfe-4d25-81bf-9eeb73805640-serving-cert\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:29 crc kubenswrapper[4931]: I0130 05:13:29.930397 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22n2\" (UniqueName: \"kubernetes.io/projected/dde9b145-fcfe-4d25-81bf-9eeb73805640-kube-api-access-s22n2\") pod \"controller-manager-67db4dc676-9s7v8\" (UID: \"dde9b145-fcfe-4d25-81bf-9eeb73805640\") " pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:30 crc kubenswrapper[4931]: I0130 05:13:30.012659 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:30 crc kubenswrapper[4931]: I0130 05:13:30.260329 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67db4dc676-9s7v8"]
Jan 30 05:13:30 crc kubenswrapper[4931]: I0130 05:13:30.768025 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8" event={"ID":"dde9b145-fcfe-4d25-81bf-9eeb73805640","Type":"ContainerStarted","Data":"79ad1673422cd9fe737fc31029457d07d4dc0b75ffec4f181f02b42877da04cf"}
Jan 30 05:13:30 crc kubenswrapper[4931]: I0130 05:13:30.768113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8" event={"ID":"dde9b145-fcfe-4d25-81bf-9eeb73805640","Type":"ContainerStarted","Data":"c4ac253b910233a2b86b7a54b421450094e8ed2c4080cf539e6c53fd41b4df35"}
Jan 30 05:13:30 crc kubenswrapper[4931]: I0130 05:13:30.803586 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8" podStartSLOduration=3.803564286 podStartE2EDuration="3.803564286s" podCreationTimestamp="2026-01-30 05:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:13:30.800098909 +0000 UTC m=+346.170009166" watchObservedRunningTime="2026-01-30 05:13:30.803564286 +0000 UTC m=+346.173474543"
Jan 30 05:13:31 crc kubenswrapper[4931]: I0130 05:13:31.782185 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:31 crc kubenswrapper[4931]: I0130 05:13:31.791064 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67db4dc676-9s7v8"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.153815 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kf2zk"]
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.161641 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.163517 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kf2zk"]
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.165181 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.279709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kl88\" (UniqueName: \"kubernetes.io/projected/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-kube-api-access-8kl88\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.279796 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-utilities\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.279923 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-catalog-content\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.343748 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wn8rd"]
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.346198 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn8rd"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.349172 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.358644 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn8rd"]
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.381534 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-catalog-content\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.381615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kl88\" (UniqueName: \"kubernetes.io/projected/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-kube-api-access-8kl88\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.381657 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-utilities\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.382164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-catalog-content\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.382212 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-utilities\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.435106 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kl88\" (UniqueName: \"kubernetes.io/projected/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-kube-api-access-8kl88\") pod \"community-operators-kf2zk\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.482826 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-catalog-content\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.482996 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2896n\" (UniqueName: \"kubernetes.io/projected/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-kube-api-access-2896n\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.483140 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-utilities\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.498047 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kf2zk"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.584262 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-catalog-content\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.584863 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2896n\" (UniqueName: \"kubernetes.io/projected/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-kube-api-access-2896n\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.584910 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-catalog-content\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd"
Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.584956 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-utilities\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.585544 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-utilities\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.610364 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2896n\" (UniqueName: \"kubernetes.io/projected/f88493be-1e8e-47b8-9ac7-d035ba0b6e36-kube-api-access-2896n\") pod \"certified-operators-wn8rd\" (UID: \"f88493be-1e8e-47b8-9ac7-d035ba0b6e36\") " pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.714476 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:44 crc kubenswrapper[4931]: I0130 05:13:44.999032 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kf2zk"] Jan 30 05:13:45 crc kubenswrapper[4931]: W0130 05:13:45.004783 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7fc26b_b0a0_4ed3_973a_d14f3118f495.slice/crio-1886542e78e49a17a6d2a06541b8bb125f47e58db49595a42b19bb6b2b8126f7 WatchSource:0}: Error finding container 1886542e78e49a17a6d2a06541b8bb125f47e58db49595a42b19bb6b2b8126f7: Status 404 returned error can't find the container with id 1886542e78e49a17a6d2a06541b8bb125f47e58db49595a42b19bb6b2b8126f7 Jan 30 05:13:45 crc kubenswrapper[4931]: I0130 05:13:45.194928 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn8rd"] Jan 30 05:13:45 crc kubenswrapper[4931]: W0130 05:13:45.205311 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf88493be_1e8e_47b8_9ac7_d035ba0b6e36.slice/crio-b6af702879f7d9cff5a84f904295f2aa1e0f5d5d5643273bfb3341cada78a420 WatchSource:0}: Error finding container b6af702879f7d9cff5a84f904295f2aa1e0f5d5d5643273bfb3341cada78a420: Status 404 returned error can't find the container with id b6af702879f7d9cff5a84f904295f2aa1e0f5d5d5643273bfb3341cada78a420 Jan 30 05:13:45 crc kubenswrapper[4931]: I0130 05:13:45.899745 4931 generic.go:334] "Generic (PLEG): container finished" podID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerID="354aad0cad4b5a2844a0aaa97a5d9c4e75d0d2f7996caccea5b63021c15588c0" exitCode=0 Jan 30 05:13:45 crc kubenswrapper[4931]: I0130 05:13:45.899852 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kf2zk" 
event={"ID":"4e7fc26b-b0a0-4ed3-973a-d14f3118f495","Type":"ContainerDied","Data":"354aad0cad4b5a2844a0aaa97a5d9c4e75d0d2f7996caccea5b63021c15588c0"} Jan 30 05:13:45 crc kubenswrapper[4931]: I0130 05:13:45.899894 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kf2zk" event={"ID":"4e7fc26b-b0a0-4ed3-973a-d14f3118f495","Type":"ContainerStarted","Data":"1886542e78e49a17a6d2a06541b8bb125f47e58db49595a42b19bb6b2b8126f7"} Jan 30 05:13:45 crc kubenswrapper[4931]: I0130 05:13:45.901911 4931 generic.go:334] "Generic (PLEG): container finished" podID="f88493be-1e8e-47b8-9ac7-d035ba0b6e36" containerID="45c24f494b946d56f859342104d13bbfa98146933c4ff15c61aeb6bdc04ed7e5" exitCode=0 Jan 30 05:13:45 crc kubenswrapper[4931]: I0130 05:13:45.901960 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn8rd" event={"ID":"f88493be-1e8e-47b8-9ac7-d035ba0b6e36","Type":"ContainerDied","Data":"45c24f494b946d56f859342104d13bbfa98146933c4ff15c61aeb6bdc04ed7e5"} Jan 30 05:13:45 crc kubenswrapper[4931]: I0130 05:13:45.901991 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn8rd" event={"ID":"f88493be-1e8e-47b8-9ac7-d035ba0b6e36","Type":"ContainerStarted","Data":"b6af702879f7d9cff5a84f904295f2aa1e0f5d5d5643273bfb3341cada78a420"} Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.549171 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w6b74"] Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.551944 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.561931 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.563523 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6b74"] Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.718868 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aacb80d-976e-4059-9c84-857aab618f4e-catalog-content\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.719510 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkrdj\" (UniqueName: \"kubernetes.io/projected/5aacb80d-976e-4059-9c84-857aab618f4e-kube-api-access-dkrdj\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.719568 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aacb80d-976e-4059-9c84-857aab618f4e-utilities\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.750280 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kg222"] Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.753342 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.759049 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.765630 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg222"] Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.821040 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aacb80d-976e-4059-9c84-857aab618f4e-catalog-content\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.821123 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkrdj\" (UniqueName: \"kubernetes.io/projected/5aacb80d-976e-4059-9c84-857aab618f4e-kube-api-access-dkrdj\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.821177 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aacb80d-976e-4059-9c84-857aab618f4e-utilities\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.822839 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aacb80d-976e-4059-9c84-857aab618f4e-utilities\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 
30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.822878 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aacb80d-976e-4059-9c84-857aab618f4e-catalog-content\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.867743 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrdj\" (UniqueName: \"kubernetes.io/projected/5aacb80d-976e-4059-9c84-857aab618f4e-kube-api-access-dkrdj\") pod \"redhat-marketplace-w6b74\" (UID: \"5aacb80d-976e-4059-9c84-857aab618f4e\") " pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.892898 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.915317 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kf2zk" event={"ID":"4e7fc26b-b0a0-4ed3-973a-d14f3118f495","Type":"ContainerStarted","Data":"3075fb652ae28ffe627ee2fb3e561168de8141b7ab9f92e2f6fdc3f70ab564db"} Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.918874 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn8rd" event={"ID":"f88493be-1e8e-47b8-9ac7-d035ba0b6e36","Type":"ContainerStarted","Data":"27fd229db93cfe74643d84e10ec520a9d77f0a3857d8b1d7dc0212a63749cb0c"} Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.923737 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxzw\" (UniqueName: \"kubernetes.io/projected/4c0c107d-a03c-479f-b127-2824affd9b35-kube-api-access-9cxzw\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " 
pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.925204 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0c107d-a03c-479f-b127-2824affd9b35-catalog-content\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:46 crc kubenswrapper[4931]: I0130 05:13:46.926709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0c107d-a03c-479f-b127-2824affd9b35-utilities\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.028023 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxzw\" (UniqueName: \"kubernetes.io/projected/4c0c107d-a03c-479f-b127-2824affd9b35-kube-api-access-9cxzw\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.028461 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0c107d-a03c-479f-b127-2824affd9b35-catalog-content\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.029077 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0c107d-a03c-479f-b127-2824affd9b35-catalog-content\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " 
pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.029211 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0c107d-a03c-479f-b127-2824affd9b35-utilities\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.029533 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0c107d-a03c-479f-b127-2824affd9b35-utilities\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.051869 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxzw\" (UniqueName: \"kubernetes.io/projected/4c0c107d-a03c-479f-b127-2824affd9b35-kube-api-access-9cxzw\") pod \"redhat-operators-kg222\" (UID: \"4c0c107d-a03c-479f-b127-2824affd9b35\") " pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.108548 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.334843 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6b74"] Jan 30 05:13:47 crc kubenswrapper[4931]: W0130 05:13:47.340036 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aacb80d_976e_4059_9c84_857aab618f4e.slice/crio-6ea9b713e2411eb4cf5ff9e021770cc75e6120c762332593b8e9c9fef28ef132 WatchSource:0}: Error finding container 6ea9b713e2411eb4cf5ff9e021770cc75e6120c762332593b8e9c9fef28ef132: Status 404 returned error can't find the container with id 6ea9b713e2411eb4cf5ff9e021770cc75e6120c762332593b8e9c9fef28ef132 Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.557199 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg222"] Jan 30 05:13:47 crc kubenswrapper[4931]: W0130 05:13:47.632535 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0c107d_a03c_479f_b127_2824affd9b35.slice/crio-b4419f302b91b29cf39b49013976953759032f4b6207ce1e5990915f2ffd1bd4 WatchSource:0}: Error finding container b4419f302b91b29cf39b49013976953759032f4b6207ce1e5990915f2ffd1bd4: Status 404 returned error can't find the container with id b4419f302b91b29cf39b49013976953759032f4b6207ce1e5990915f2ffd1bd4 Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.930088 4931 generic.go:334] "Generic (PLEG): container finished" podID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerID="3075fb652ae28ffe627ee2fb3e561168de8141b7ab9f92e2f6fdc3f70ab564db" exitCode=0 Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.930212 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kf2zk" 
event={"ID":"4e7fc26b-b0a0-4ed3-973a-d14f3118f495","Type":"ContainerDied","Data":"3075fb652ae28ffe627ee2fb3e561168de8141b7ab9f92e2f6fdc3f70ab564db"} Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.933535 4931 generic.go:334] "Generic (PLEG): container finished" podID="4c0c107d-a03c-479f-b127-2824affd9b35" containerID="6186d49243afe7d666fe2ccf6dce075fb9428c4fc6045951ddee6f7395f960d2" exitCode=0 Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.933617 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg222" event={"ID":"4c0c107d-a03c-479f-b127-2824affd9b35","Type":"ContainerDied","Data":"6186d49243afe7d666fe2ccf6dce075fb9428c4fc6045951ddee6f7395f960d2"} Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.933661 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg222" event={"ID":"4c0c107d-a03c-479f-b127-2824affd9b35","Type":"ContainerStarted","Data":"b4419f302b91b29cf39b49013976953759032f4b6207ce1e5990915f2ffd1bd4"} Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.944957 4931 generic.go:334] "Generic (PLEG): container finished" podID="f88493be-1e8e-47b8-9ac7-d035ba0b6e36" containerID="27fd229db93cfe74643d84e10ec520a9d77f0a3857d8b1d7dc0212a63749cb0c" exitCode=0 Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.945050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn8rd" event={"ID":"f88493be-1e8e-47b8-9ac7-d035ba0b6e36","Type":"ContainerDied","Data":"27fd229db93cfe74643d84e10ec520a9d77f0a3857d8b1d7dc0212a63749cb0c"} Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.951484 4931 generic.go:334] "Generic (PLEG): container finished" podID="5aacb80d-976e-4059-9c84-857aab618f4e" containerID="6b360c827ca6ade30a85377101c3662f34c4012c21bad96a60cbcf8efc2c37fb" exitCode=0 Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.951538 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-w6b74" event={"ID":"5aacb80d-976e-4059-9c84-857aab618f4e","Type":"ContainerDied","Data":"6b360c827ca6ade30a85377101c3662f34c4012c21bad96a60cbcf8efc2c37fb"} Jan 30 05:13:47 crc kubenswrapper[4931]: I0130 05:13:47.951594 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6b74" event={"ID":"5aacb80d-976e-4059-9c84-857aab618f4e","Type":"ContainerStarted","Data":"6ea9b713e2411eb4cf5ff9e021770cc75e6120c762332593b8e9c9fef28ef132"} Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.249495 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" podUID="32e4a367-9945-4fdb-b5bc-4c8d35512264" containerName="registry" containerID="cri-o://a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9" gracePeriod=30 Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.844110 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.955540 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32e4a367-9945-4fdb-b5bc-4c8d35512264-ca-trust-extracted\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.955596 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-tls\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.955635 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-certificates\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.955703 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nltg\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-kube-api-access-6nltg\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.955810 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.955831 4931 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32e4a367-9945-4fdb-b5bc-4c8d35512264-installation-pull-secrets\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.955854 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-trusted-ca\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.956757 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.956864 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-bound-sa-token\") pod \"32e4a367-9945-4fdb-b5bc-4c8d35512264\" (UID: \"32e4a367-9945-4fdb-b5bc-4c8d35512264\") " Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.957016 4931 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.962709 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-kube-api-access-6nltg" (OuterVolumeSpecName: "kube-api-access-6nltg") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). InnerVolumeSpecName "kube-api-access-6nltg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.963039 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.964522 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e4a367-9945-4fdb-b5bc-4c8d35512264-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.964822 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.968948 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.969388 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg222" event={"ID":"4c0c107d-a03c-479f-b127-2824affd9b35","Type":"ContainerStarted","Data":"165b5c448d74db26e17321b06cbf97fa28ab34612e8d88e1196f010e5fec0247"} Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.973888 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.987155 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn8rd" event={"ID":"f88493be-1e8e-47b8-9ac7-d035ba0b6e36","Type":"ContainerStarted","Data":"07276f5318ed5db8e2469700b70de6a9f88c4112e661c5f07bcfe2b44242a25c"} Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.989749 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32e4a367-9945-4fdb-b5bc-4c8d35512264-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "32e4a367-9945-4fdb-b5bc-4c8d35512264" (UID: "32e4a367-9945-4fdb-b5bc-4c8d35512264"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.993753 4931 generic.go:334] "Generic (PLEG): container finished" podID="32e4a367-9945-4fdb-b5bc-4c8d35512264" containerID="a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9" exitCode=0 Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.993888 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.994730 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" event={"ID":"32e4a367-9945-4fdb-b5bc-4c8d35512264","Type":"ContainerDied","Data":"a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9"} Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.994794 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f8zg7" event={"ID":"32e4a367-9945-4fdb-b5bc-4c8d35512264","Type":"ContainerDied","Data":"79ebc9473f22f72df11aa297cb419ebdd7c57ca36caf670a91a0d056621b7c54"} Jan 30 05:13:48 crc kubenswrapper[4931]: I0130 05:13:48.994816 4931 scope.go:117] "RemoveContainer" containerID="a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.003363 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kf2zk" event={"ID":"4e7fc26b-b0a0-4ed3-973a-d14f3118f495","Type":"ContainerStarted","Data":"f2435e1bbad2c342892feef9048c365b9106973a68f19a101621695990d1928d"} Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.038665 4931 scope.go:117] "RemoveContainer" containerID="a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.050118 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wn8rd" podStartSLOduration=2.601924856 podStartE2EDuration="5.049810353s" podCreationTimestamp="2026-01-30 05:13:44 +0000 UTC" firstStartedPulling="2026-01-30 05:13:45.905306821 +0000 UTC m=+361.275217098" lastFinishedPulling="2026-01-30 05:13:48.353192298 +0000 UTC m=+363.723102595" observedRunningTime="2026-01-30 05:13:49.029712047 +0000 UTC m=+364.399622334" 
watchObservedRunningTime="2026-01-30 05:13:49.049810353 +0000 UTC m=+364.419720630" Jan 30 05:13:49 crc kubenswrapper[4931]: E0130 05:13:49.055931 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9\": container with ID starting with a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9 not found: ID does not exist" containerID="a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.056004 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9"} err="failed to get container status \"a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9\": rpc error: code = NotFound desc = could not find container \"a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9\": container with ID starting with a4b4ba0915f07fcf004390096fa0954885a608b2a848ca36a182ab0388cbfba9 not found: ID does not exist" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.058706 4931 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32e4a367-9945-4fdb-b5bc-4c8d35512264-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.059130 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32e4a367-9945-4fdb-b5bc-4c8d35512264-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.059365 4931 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 05:13:49 crc kubenswrapper[4931]: 
I0130 05:13:49.059570 4931 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32e4a367-9945-4fdb-b5bc-4c8d35512264-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.059760 4931 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.059967 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nltg\" (UniqueName: \"kubernetes.io/projected/32e4a367-9945-4fdb-b5bc-4c8d35512264-kube-api-access-6nltg\") on node \"crc\" DevicePath \"\"" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.084138 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f8zg7"] Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.088052 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f8zg7"] Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.088785 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kf2zk" podStartSLOduration=2.586018518 podStartE2EDuration="5.08876755s" podCreationTimestamp="2026-01-30 05:13:44 +0000 UTC" firstStartedPulling="2026-01-30 05:13:45.906647088 +0000 UTC m=+361.276557355" lastFinishedPulling="2026-01-30 05:13:48.40939612 +0000 UTC m=+363.779306387" observedRunningTime="2026-01-30 05:13:49.079813038 +0000 UTC m=+364.449723295" watchObservedRunningTime="2026-01-30 05:13:49.08876755 +0000 UTC m=+364.458677807" Jan 30 05:13:49 crc kubenswrapper[4931]: I0130 05:13:49.429803 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e4a367-9945-4fdb-b5bc-4c8d35512264" 
path="/var/lib/kubelet/pods/32e4a367-9945-4fdb-b5bc-4c8d35512264/volumes" Jan 30 05:13:50 crc kubenswrapper[4931]: I0130 05:13:50.012615 4931 generic.go:334] "Generic (PLEG): container finished" podID="4c0c107d-a03c-479f-b127-2824affd9b35" containerID="165b5c448d74db26e17321b06cbf97fa28ab34612e8d88e1196f010e5fec0247" exitCode=0 Jan 30 05:13:50 crc kubenswrapper[4931]: I0130 05:13:50.012742 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg222" event={"ID":"4c0c107d-a03c-479f-b127-2824affd9b35","Type":"ContainerDied","Data":"165b5c448d74db26e17321b06cbf97fa28ab34612e8d88e1196f010e5fec0247"} Jan 30 05:13:50 crc kubenswrapper[4931]: I0130 05:13:50.015871 4931 generic.go:334] "Generic (PLEG): container finished" podID="5aacb80d-976e-4059-9c84-857aab618f4e" containerID="9e41ae65676a2a4218fe39ac6aa91019b0cff14cc71a71f96119ceb918456d3a" exitCode=0 Jan 30 05:13:50 crc kubenswrapper[4931]: I0130 05:13:50.016294 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6b74" event={"ID":"5aacb80d-976e-4059-9c84-857aab618f4e","Type":"ContainerDied","Data":"9e41ae65676a2a4218fe39ac6aa91019b0cff14cc71a71f96119ceb918456d3a"} Jan 30 05:13:51 crc kubenswrapper[4931]: I0130 05:13:51.026619 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg222" event={"ID":"4c0c107d-a03c-479f-b127-2824affd9b35","Type":"ContainerStarted","Data":"01338e299128fcc6de41c64c196d979b4e4f88cb88b53aa2c7878ac629faa42f"} Jan 30 05:13:51 crc kubenswrapper[4931]: I0130 05:13:51.033201 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6b74" event={"ID":"5aacb80d-976e-4059-9c84-857aab618f4e","Type":"ContainerStarted","Data":"8762433b8429fe35611223268c6addbe95078f2256b8feee27b88c7c4a2321ad"} Jan 30 05:13:51 crc kubenswrapper[4931]: I0130 05:13:51.062122 4931 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-kg222" podStartSLOduration=2.522917738 podStartE2EDuration="5.062098906s" podCreationTimestamp="2026-01-30 05:13:46 +0000 UTC" firstStartedPulling="2026-01-30 05:13:47.936162876 +0000 UTC m=+363.306073133" lastFinishedPulling="2026-01-30 05:13:50.475344044 +0000 UTC m=+365.845254301" observedRunningTime="2026-01-30 05:13:51.054966505 +0000 UTC m=+366.424876802" watchObservedRunningTime="2026-01-30 05:13:51.062098906 +0000 UTC m=+366.432009193" Jan 30 05:13:51 crc kubenswrapper[4931]: I0130 05:13:51.087642 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w6b74" podStartSLOduration=2.585274394 podStartE2EDuration="5.087624215s" podCreationTimestamp="2026-01-30 05:13:46 +0000 UTC" firstStartedPulling="2026-01-30 05:13:47.953788622 +0000 UTC m=+363.323698889" lastFinishedPulling="2026-01-30 05:13:50.456138463 +0000 UTC m=+365.826048710" observedRunningTime="2026-01-30 05:13:51.081999976 +0000 UTC m=+366.451910273" watchObservedRunningTime="2026-01-30 05:13:51.087624215 +0000 UTC m=+366.457534472" Jan 30 05:13:54 crc kubenswrapper[4931]: I0130 05:13:54.498573 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kf2zk" Jan 30 05:13:54 crc kubenswrapper[4931]: I0130 05:13:54.499282 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kf2zk" Jan 30 05:13:54 crc kubenswrapper[4931]: I0130 05:13:54.581014 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kf2zk" Jan 30 05:13:54 crc kubenswrapper[4931]: I0130 05:13:54.715053 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:54 crc kubenswrapper[4931]: I0130 05:13:54.715148 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:54 crc kubenswrapper[4931]: I0130 05:13:54.778390 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:55 crc kubenswrapper[4931]: I0130 05:13:55.131738 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kf2zk" Jan 30 05:13:55 crc kubenswrapper[4931]: I0130 05:13:55.137004 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wn8rd" Jan 30 05:13:56 crc kubenswrapper[4931]: I0130 05:13:56.894202 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:56 crc kubenswrapper[4931]: I0130 05:13:56.894743 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:56 crc kubenswrapper[4931]: I0130 05:13:56.968987 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:57 crc kubenswrapper[4931]: I0130 05:13:57.109512 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:57 crc kubenswrapper[4931]: I0130 05:13:57.109580 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:13:57 crc kubenswrapper[4931]: I0130 05:13:57.168213 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w6b74" Jan 30 05:13:57 crc kubenswrapper[4931]: I0130 05:13:57.364117 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:13:57 crc kubenswrapper[4931]: I0130 05:13:57.364219 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:13:58 crc kubenswrapper[4931]: I0130 05:13:58.171384 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kg222" podUID="4c0c107d-a03c-479f-b127-2824affd9b35" containerName="registry-server" probeResult="failure" output=< Jan 30 05:13:58 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 05:13:58 crc kubenswrapper[4931]: > Jan 30 05:14:07 crc kubenswrapper[4931]: I0130 05:14:07.181637 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:14:07 crc kubenswrapper[4931]: I0130 05:14:07.236195 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kg222" Jan 30 05:14:27 crc kubenswrapper[4931]: I0130 05:14:27.363700 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:14:27 crc kubenswrapper[4931]: I0130 05:14:27.364518 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:14:27 crc kubenswrapper[4931]: I0130 05:14:27.364607 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:14:27 crc kubenswrapper[4931]: I0130 05:14:27.365513 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"595c92fb582df913939826c54f51177c31890e24d0ac56595342acf9749b06a1"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:14:27 crc kubenswrapper[4931]: I0130 05:14:27.365582 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://595c92fb582df913939826c54f51177c31890e24d0ac56595342acf9749b06a1" gracePeriod=600 Jan 30 05:14:28 crc kubenswrapper[4931]: I0130 05:14:28.309479 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="595c92fb582df913939826c54f51177c31890e24d0ac56595342acf9749b06a1" exitCode=0 Jan 30 05:14:28 crc kubenswrapper[4931]: I0130 05:14:28.309575 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"595c92fb582df913939826c54f51177c31890e24d0ac56595342acf9749b06a1"} Jan 30 05:14:28 crc kubenswrapper[4931]: I0130 05:14:28.309915 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"ca23316a7a7a0870cd6ce778a3ddf7b3692d29f58078872d0288efcbee40c2e8"} Jan 30 05:14:28 crc kubenswrapper[4931]: I0130 05:14:28.309968 4931 scope.go:117] "RemoveContainer" containerID="f1f2cffd1648795ac3e32c2c575ebedaff06806cb99eadf1e6def95e47f01d96" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.210457 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb"] Jan 30 05:15:00 crc kubenswrapper[4931]: E0130 05:15:00.211506 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e4a367-9945-4fdb-b5bc-4c8d35512264" containerName="registry" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.211530 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e4a367-9945-4fdb-b5bc-4c8d35512264" containerName="registry" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.211694 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e4a367-9945-4fdb-b5bc-4c8d35512264" containerName="registry" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.212340 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.215094 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.223784 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.231396 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb"] Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.396983 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjfz\" (UniqueName: \"kubernetes.io/projected/2119e7a8-c484-4aef-ac04-c3f82433738d-kube-api-access-bgjfz\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.397333 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2119e7a8-c484-4aef-ac04-c3f82433738d-config-volume\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.397601 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2119e7a8-c484-4aef-ac04-c3f82433738d-secret-volume\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.499268 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2119e7a8-c484-4aef-ac04-c3f82433738d-config-volume\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.499403 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2119e7a8-c484-4aef-ac04-c3f82433738d-secret-volume\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.499480 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjfz\" (UniqueName: \"kubernetes.io/projected/2119e7a8-c484-4aef-ac04-c3f82433738d-kube-api-access-bgjfz\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.500986 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2119e7a8-c484-4aef-ac04-c3f82433738d-config-volume\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.510699 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2119e7a8-c484-4aef-ac04-c3f82433738d-secret-volume\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.529663 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjfz\" (UniqueName: \"kubernetes.io/projected/2119e7a8-c484-4aef-ac04-c3f82433738d-kube-api-access-bgjfz\") pod \"collect-profiles-29495835-kk9gb\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:00 crc kubenswrapper[4931]: I0130 05:15:00.545190 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:01 crc kubenswrapper[4931]: I0130 05:15:01.054172 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb"] Jan 30 05:15:01 crc kubenswrapper[4931]: W0130 05:15:01.066581 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2119e7a8_c484_4aef_ac04_c3f82433738d.slice/crio-94874094e900641691e4daed7904fb2f898cf4600c87bf824f1a717fb45da16c WatchSource:0}: Error finding container 94874094e900641691e4daed7904fb2f898cf4600c87bf824f1a717fb45da16c: Status 404 returned error can't find the container with id 94874094e900641691e4daed7904fb2f898cf4600c87bf824f1a717fb45da16c Jan 30 05:15:01 crc kubenswrapper[4931]: I0130 05:15:01.607855 4931 generic.go:334] "Generic (PLEG): container finished" podID="2119e7a8-c484-4aef-ac04-c3f82433738d" containerID="93024ef1482e0faf5c83b31d25bb0153752fe08f7d8619cc6cdb7d2120e5e084" exitCode=0 Jan 30 05:15:01 crc kubenswrapper[4931]: I0130 05:15:01.607971 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" event={"ID":"2119e7a8-c484-4aef-ac04-c3f82433738d","Type":"ContainerDied","Data":"93024ef1482e0faf5c83b31d25bb0153752fe08f7d8619cc6cdb7d2120e5e084"} Jan 30 05:15:01 crc kubenswrapper[4931]: I0130 05:15:01.608340 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" event={"ID":"2119e7a8-c484-4aef-ac04-c3f82433738d","Type":"ContainerStarted","Data":"94874094e900641691e4daed7904fb2f898cf4600c87bf824f1a717fb45da16c"} Jan 30 05:15:02 crc kubenswrapper[4931]: I0130 05:15:02.956757 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.140839 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2119e7a8-c484-4aef-ac04-c3f82433738d-config-volume\") pod \"2119e7a8-c484-4aef-ac04-c3f82433738d\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.140959 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgjfz\" (UniqueName: \"kubernetes.io/projected/2119e7a8-c484-4aef-ac04-c3f82433738d-kube-api-access-bgjfz\") pod \"2119e7a8-c484-4aef-ac04-c3f82433738d\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.141062 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2119e7a8-c484-4aef-ac04-c3f82433738d-secret-volume\") pod \"2119e7a8-c484-4aef-ac04-c3f82433738d\" (UID: \"2119e7a8-c484-4aef-ac04-c3f82433738d\") " Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.142472 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2119e7a8-c484-4aef-ac04-c3f82433738d-config-volume" (OuterVolumeSpecName: "config-volume") pod "2119e7a8-c484-4aef-ac04-c3f82433738d" (UID: "2119e7a8-c484-4aef-ac04-c3f82433738d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.148370 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2119e7a8-c484-4aef-ac04-c3f82433738d-kube-api-access-bgjfz" (OuterVolumeSpecName: "kube-api-access-bgjfz") pod "2119e7a8-c484-4aef-ac04-c3f82433738d" (UID: "2119e7a8-c484-4aef-ac04-c3f82433738d"). InnerVolumeSpecName "kube-api-access-bgjfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.151762 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2119e7a8-c484-4aef-ac04-c3f82433738d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2119e7a8-c484-4aef-ac04-c3f82433738d" (UID: "2119e7a8-c484-4aef-ac04-c3f82433738d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.243502 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2119e7a8-c484-4aef-ac04-c3f82433738d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.243557 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2119e7a8-c484-4aef-ac04-c3f82433738d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.243577 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgjfz\" (UniqueName: \"kubernetes.io/projected/2119e7a8-c484-4aef-ac04-c3f82433738d-kube-api-access-bgjfz\") on node \"crc\" DevicePath \"\"" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.633126 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" event={"ID":"2119e7a8-c484-4aef-ac04-c3f82433738d","Type":"ContainerDied","Data":"94874094e900641691e4daed7904fb2f898cf4600c87bf824f1a717fb45da16c"} Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.633184 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb" Jan 30 05:15:03 crc kubenswrapper[4931]: I0130 05:15:03.633210 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94874094e900641691e4daed7904fb2f898cf4600c87bf824f1a717fb45da16c" Jan 30 05:16:27 crc kubenswrapper[4931]: I0130 05:16:27.362975 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:16:27 crc kubenswrapper[4931]: I0130 05:16:27.363848 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:16:57 crc kubenswrapper[4931]: I0130 05:16:57.363220 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:16:57 crc kubenswrapper[4931]: I0130 05:16:57.363994 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.363084 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.363848 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.363918 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.364925 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca23316a7a7a0870cd6ce778a3ddf7b3692d29f58078872d0288efcbee40c2e8"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.365074 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://ca23316a7a7a0870cd6ce778a3ddf7b3692d29f58078872d0288efcbee40c2e8" gracePeriod=600 Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.694887 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="ca23316a7a7a0870cd6ce778a3ddf7b3692d29f58078872d0288efcbee40c2e8" exitCode=0 Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.694992 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"ca23316a7a7a0870cd6ce778a3ddf7b3692d29f58078872d0288efcbee40c2e8"} Jan 30 05:17:27 crc kubenswrapper[4931]: I0130 05:17:27.695554 4931 scope.go:117] "RemoveContainer" containerID="595c92fb582df913939826c54f51177c31890e24d0ac56595342acf9749b06a1" Jan 30 05:17:28 crc kubenswrapper[4931]: I0130 05:17:28.709467 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"1794ca6ffdd404c39dffe9fa048526a3a78869de00d876a52d3bd280c8bbc2a2"} Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.554392 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bshbf"] Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.556027 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-controller" containerID="cri-o://61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0" gracePeriod=30 Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.556252 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="sbdb" containerID="cri-o://9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8" gracePeriod=30 Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.556318 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="nbdb" containerID="cri-o://a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29" gracePeriod=30 Jan 30 05:18:19 
crc kubenswrapper[4931]: I0130 05:18:19.556374 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="northd" containerID="cri-o://839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512" gracePeriod=30 Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.556463 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3" gracePeriod=30 Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.556580 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-acl-logging" containerID="cri-o://baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8" gracePeriod=30 Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.556904 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-node" containerID="cri-o://a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b" gracePeriod=30 Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.622188 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" containerID="cri-o://cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c" gracePeriod=30 Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.943514 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/3.log" Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.946122 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovn-acl-logging/0.log" Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.946771 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovn-controller/0.log" Jan 30 05:18:19 crc kubenswrapper[4931]: I0130 05:18:19.947613 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039164 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s68jr"] Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039534 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039553 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039567 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="sbdb" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039576 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="sbdb" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039586 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039593 
4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039604 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="nbdb" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039610 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="nbdb" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039624 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039630 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039638 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039644 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039654 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="northd" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039662 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="northd" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039672 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039680 4931 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039691 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039701 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039710 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-acl-logging" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039718 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-acl-logging" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039731 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2119e7a8-c484-4aef-ac04-c3f82433738d" containerName="collect-profiles" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039737 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2119e7a8-c484-4aef-ac04-c3f82433738d" containerName="collect-profiles" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039745 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kubecfg-setup" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039751 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kubecfg-setup" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.039761 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-node" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039768 4931 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-node" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039902 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="sbdb" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039916 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039924 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039934 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="nbdb" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039942 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2119e7a8-c484-4aef-ac04-c3f82433738d" containerName="collect-profiles" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039951 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-node" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039959 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039966 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="northd" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039974 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039980 4931 
memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovn-acl-logging" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039990 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.039997 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.040139 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.040146 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.040339 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" containerName="ovnkube-controller" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.042563 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.070940 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-bin\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.070992 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-slash\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071043 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-systemd\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071114 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-ovn\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071150 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-node-log\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071207 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-script-lib\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071235 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-log-socket\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071254 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-slash" (OuterVolumeSpecName: "host-slash") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071283 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-openvswitch\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071304 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-ovn-kubernetes\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071320 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: 
"556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071324 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-systemd-units\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-netd\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071404 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-etc-openvswitch\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071530 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-config\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071559 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-env-overrides\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071583 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-var-lib-cni-networks-ovn-kubernetes\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071628 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-var-lib-openvswitch\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071758 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-netns\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071782 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-kubelet\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071855 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwbjv\" (UniqueName: \"kubernetes.io/projected/556d9fc5-72b4-4134-8074-1e9d07012763-kube-api-access-rwbjv\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.071995 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/556d9fc5-72b4-4134-8074-1e9d07012763-ovn-node-metrics-cert\") pod \"556d9fc5-72b4-4134-8074-1e9d07012763\" (UID: \"556d9fc5-72b4-4134-8074-1e9d07012763\") " Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.072063 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.072371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.072480 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-node-log" (OuterVolumeSpecName: "node-log") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.072625 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073085 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073330 4931 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073365 4931 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-slash\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073375 4931 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073384 4931 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-node-log\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073394 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073404 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073413 4931 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073596 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-log-socket" (OuterVolumeSpecName: "log-socket") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073620 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073658 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073680 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073697 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073748 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073771 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073808 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.073885 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.074101 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.082568 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556d9fc5-72b4-4134-8074-1e9d07012763-kube-api-access-rwbjv" (OuterVolumeSpecName: "kube-api-access-rwbjv") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "kube-api-access-rwbjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.084544 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/2.log" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.085296 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/1.log" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.085353 4931 generic.go:334] "Generic (PLEG): container finished" podID="b17d6adf-e35b-4bf8-9ab2-e6720e595835" containerID="9cbe0bfee502f12e8f2f3a6f1a461efb27353f5529809ccc54fecbb26b304ada" exitCode=2 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.085473 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerDied","Data":"9cbe0bfee502f12e8f2f3a6f1a461efb27353f5529809ccc54fecbb26b304ada"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.085538 4931 scope.go:117] "RemoveContainer" containerID="c041a8359fd83023525663275e8ceb17e995c49053a1ecb2b1822b95caf07eb0" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.085871 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556d9fc5-72b4-4134-8074-1e9d07012763-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.086399 4931 scope.go:117] "RemoveContainer" containerID="9cbe0bfee502f12e8f2f3a6f1a461efb27353f5529809ccc54fecbb26b304ada" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.086825 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lm7vv_openshift-multus(b17d6adf-e35b-4bf8-9ab2-e6720e595835)\"" pod="openshift-multus/multus-lm7vv" podUID="b17d6adf-e35b-4bf8-9ab2-e6720e595835" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.094994 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovnkube-controller/3.log" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.097014 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "556d9fc5-72b4-4134-8074-1e9d07012763" (UID: "556d9fc5-72b4-4134-8074-1e9d07012763"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.098275 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovn-acl-logging/0.log" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.098890 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bshbf_556d9fc5-72b4-4134-8074-1e9d07012763/ovn-controller/0.log" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099289 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c" exitCode=0 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099318 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8" exitCode=0 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099328 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29" exitCode=0 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099341 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512" exitCode=0 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099354 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3" exitCode=0 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099366 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" 
containerID="a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b" exitCode=0 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099375 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8" exitCode=143 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099387 4931 generic.go:334] "Generic (PLEG): container finished" podID="556d9fc5-72b4-4134-8074-1e9d07012763" containerID="61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0" exitCode=143 Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099415 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099471 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099487 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099514 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099526 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099540 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099554 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099563 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099571 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099578 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099586 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099594 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099602 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099610 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099618 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099628 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099640 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099648 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099656 4931 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099664 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099671 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099679 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099686 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099693 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099701 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099708 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099718 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099728 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099736 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099744 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099753 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099760 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099768 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099775 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} Jan 30 
05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099783 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099790 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099798 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099808 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" event={"ID":"556d9fc5-72b4-4134-8074-1e9d07012763","Type":"ContainerDied","Data":"ae003bf2c8441af0b322798040d7d0e26c38e678b0b4800e8ee8c379eec9e42a"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099819 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099827 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099835 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099842 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099849 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099857 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099865 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099872 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099879 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.099886 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.100022 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bshbf" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.123032 4931 scope.go:117] "RemoveContainer" containerID="cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.153505 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bshbf"] Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.160053 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bshbf"] Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.160569 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.174990 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovnkube-script-lib\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175073 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-ovn\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175115 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovn-node-metrics-cert\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 
05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175149 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-kubelet\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175183 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-slash\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175226 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-log-socket\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175283 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-node-log\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175320 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-systemd-units\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc 
kubenswrapper[4931]: I0130 05:18:20.175346 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovnkube-config\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175376 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-env-overrides\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175447 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-run-netns\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.175479 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.176635 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-cni-netd\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.176784 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-systemd\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.176993 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177080 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-etc-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177124 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177156 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-var-lib-openvswitch\") 
pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177288 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-cni-bin\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177407 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-797v8\" (UniqueName: \"kubernetes.io/projected/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-kube-api-access-797v8\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177633 4931 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177658 4931 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-log-socket\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177671 4931 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177684 4931 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-ovn-kubernetes\") on node 
\"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177695 4931 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177705 4931 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177716 4931 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177727 4931 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556d9fc5-72b4-4134-8074-1e9d07012763-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177738 4931 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177748 4931 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177759 4931 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556d9fc5-72b4-4134-8074-1e9d07012763-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 
05:18:20.177790 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwbjv\" (UniqueName: \"kubernetes.io/projected/556d9fc5-72b4-4134-8074-1e9d07012763-kube-api-access-rwbjv\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.177802 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/556d9fc5-72b4-4134-8074-1e9d07012763-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.186098 4931 scope.go:117] "RemoveContainer" containerID="9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.209321 4931 scope.go:117] "RemoveContainer" containerID="a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.224828 4931 scope.go:117] "RemoveContainer" containerID="839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.241735 4931 scope.go:117] "RemoveContainer" containerID="42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.259911 4931 scope.go:117] "RemoveContainer" containerID="a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.275078 4931 scope.go:117] "RemoveContainer" containerID="baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279291 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-ovn\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: 
I0130 05:18:20.279334 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovn-node-metrics-cert\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279363 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-kubelet\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279386 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-slash\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279415 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-log-socket\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279465 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-node-log\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279487 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-systemd-units\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279508 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovnkube-config\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279528 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-env-overrides\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279556 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-run-netns\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279579 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279602 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-cni-netd\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279631 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-systemd\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279678 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-etc-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279723 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279747 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-var-lib-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279771 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-cni-bin\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279796 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-797v8\" (UniqueName: \"kubernetes.io/projected/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-kube-api-access-797v8\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.279821 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovnkube-script-lib\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280109 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-etc-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280161 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280115 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-run-netns\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280253 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280251 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-var-lib-openvswitch\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280235 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280312 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-cni-bin\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280318 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-ovn\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280220 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-run-systemd\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280187 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-cni-netd\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280364 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-kubelet\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280462 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-systemd-units\") pod \"ovnkube-node-s68jr\" (UID: 
\"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280466 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-log-socket\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-node-log\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280574 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-host-slash\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280660 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovnkube-script-lib\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.280947 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-env-overrides\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 
05:18:20.281208 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovnkube-config\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.284279 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-ovn-node-metrics-cert\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.297473 4931 scope.go:117] "RemoveContainer" containerID="61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.312877 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-797v8\" (UniqueName: \"kubernetes.io/projected/43fde21b-c04b-428e-a4bb-4f6e4969bd5f-kube-api-access-797v8\") pod \"ovnkube-node-s68jr\" (UID: \"43fde21b-c04b-428e-a4bb-4f6e4969bd5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.313386 4931 scope.go:117] "RemoveContainer" containerID="e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.330187 4931 scope.go:117] "RemoveContainer" containerID="cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.330736 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": container with ID starting with cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c 
not found: ID does not exist" containerID="cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.330820 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} err="failed to get container status \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": rpc error: code = NotFound desc = could not find container \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": container with ID starting with cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.330887 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.331461 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": container with ID starting with 7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c not found: ID does not exist" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.331504 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} err="failed to get container status \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": rpc error: code = NotFound desc = could not find container \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": container with ID starting with 7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 
05:18:20.331551 4931 scope.go:117] "RemoveContainer" containerID="9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.332116 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": container with ID starting with 9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8 not found: ID does not exist" containerID="9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.332172 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} err="failed to get container status \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": rpc error: code = NotFound desc = could not find container \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": container with ID starting with 9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.332212 4931 scope.go:117] "RemoveContainer" containerID="a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.332677 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": container with ID starting with a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29 not found: ID does not exist" containerID="a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.332704 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} err="failed to get container status \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": rpc error: code = NotFound desc = could not find container \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": container with ID starting with a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.332727 4931 scope.go:117] "RemoveContainer" containerID="839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.333061 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": container with ID starting with 839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512 not found: ID does not exist" containerID="839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.333111 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} err="failed to get container status \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": rpc error: code = NotFound desc = could not find container \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": container with ID starting with 839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.333143 4931 scope.go:117] "RemoveContainer" containerID="42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.333581 4931 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": container with ID starting with 42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3 not found: ID does not exist" containerID="42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.333606 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} err="failed to get container status \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": rpc error: code = NotFound desc = could not find container \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": container with ID starting with 42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.333623 4931 scope.go:117] "RemoveContainer" containerID="a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.333853 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": container with ID starting with a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b not found: ID does not exist" containerID="a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.333871 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} err="failed to get container status \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": rpc error: code = NotFound desc = could not find container 
\"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": container with ID starting with a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.333888 4931 scope.go:117] "RemoveContainer" containerID="baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.334248 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": container with ID starting with baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8 not found: ID does not exist" containerID="baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.334268 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} err="failed to get container status \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": rpc error: code = NotFound desc = could not find container \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": container with ID starting with baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.334282 4931 scope.go:117] "RemoveContainer" containerID="61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.334701 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": container with ID starting with 61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0 not found: ID does not exist" 
containerID="61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.334738 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} err="failed to get container status \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": rpc error: code = NotFound desc = could not find container \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": container with ID starting with 61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.334758 4931 scope.go:117] "RemoveContainer" containerID="e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929" Jan 30 05:18:20 crc kubenswrapper[4931]: E0130 05:18:20.335099 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": container with ID starting with e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929 not found: ID does not exist" containerID="e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.335155 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} err="failed to get container status \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": rpc error: code = NotFound desc = could not find container \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": container with ID starting with e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.335185 4931 scope.go:117] 
"RemoveContainer" containerID="cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.335591 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} err="failed to get container status \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": rpc error: code = NotFound desc = could not find container \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": container with ID starting with cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.335617 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.335967 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} err="failed to get container status \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": rpc error: code = NotFound desc = could not find container \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": container with ID starting with 7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.335999 4931 scope.go:117] "RemoveContainer" containerID="9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.336366 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} err="failed to get container status \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": rpc error: code = 
NotFound desc = could not find container \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": container with ID starting with 9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.336402 4931 scope.go:117] "RemoveContainer" containerID="a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.336794 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} err="failed to get container status \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": rpc error: code = NotFound desc = could not find container \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": container with ID starting with a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.336819 4931 scope.go:117] "RemoveContainer" containerID="839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.337534 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} err="failed to get container status \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": rpc error: code = NotFound desc = could not find container \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": container with ID starting with 839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.337572 4931 scope.go:117] "RemoveContainer" containerID="42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3" Jan 30 05:18:20 crc 
kubenswrapper[4931]: I0130 05:18:20.337916 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} err="failed to get container status \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": rpc error: code = NotFound desc = could not find container \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": container with ID starting with 42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.337942 4931 scope.go:117] "RemoveContainer" containerID="a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.338293 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} err="failed to get container status \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": rpc error: code = NotFound desc = could not find container \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": container with ID starting with a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.338329 4931 scope.go:117] "RemoveContainer" containerID="baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.338744 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} err="failed to get container status \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": rpc error: code = NotFound desc = could not find container \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": container 
with ID starting with baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.338792 4931 scope.go:117] "RemoveContainer" containerID="61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.339155 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} err="failed to get container status \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": rpc error: code = NotFound desc = could not find container \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": container with ID starting with 61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.339183 4931 scope.go:117] "RemoveContainer" containerID="e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.339509 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} err="failed to get container status \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": rpc error: code = NotFound desc = could not find container \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": container with ID starting with e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929 not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.339541 4931 scope.go:117] "RemoveContainer" containerID="cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.339853 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} err="failed to get container status \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": rpc error: code = NotFound desc = could not find container \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": container with ID starting with cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.339884 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.340191 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} err="failed to get container status \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": rpc error: code = NotFound desc = could not find container \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": container with ID starting with 7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c not found: ID does not exist" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.340222 4931 scope.go:117] "RemoveContainer" containerID="9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8" Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.340560 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} err="failed to get container status \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": rpc error: code = NotFound desc = could not find container \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": container with ID starting with 9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8 not found: ID does not 
exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.340599 4931 scope.go:117] "RemoveContainer" containerID="a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.340954 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} err="failed to get container status \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": rpc error: code = NotFound desc = could not find container \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": container with ID starting with a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.340979 4931 scope.go:117] "RemoveContainer" containerID="839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.341472 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} err="failed to get container status \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": rpc error: code = NotFound desc = could not find container \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": container with ID starting with 839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.341564 4931 scope.go:117] "RemoveContainer" containerID="42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.342099 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} err="failed to get container status \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": rpc error: code = NotFound desc = could not find container \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": container with ID starting with 42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.342123 4931 scope.go:117] "RemoveContainer" containerID="a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.342707 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} err="failed to get container status \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": rpc error: code = NotFound desc = could not find container \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": container with ID starting with a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.342734 4931 scope.go:117] "RemoveContainer" containerID="baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.343050 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} err="failed to get container status \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": rpc error: code = NotFound desc = could not find container \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": container with ID starting with baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.343081 4931 scope.go:117] "RemoveContainer" containerID="61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.343398 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} err="failed to get container status \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": rpc error: code = NotFound desc = could not find container \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": container with ID starting with 61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.343441 4931 scope.go:117] "RemoveContainer" containerID="e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.343757 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} err="failed to get container status \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": rpc error: code = NotFound desc = could not find container \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": container with ID starting with e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.343787 4931 scope.go:117] "RemoveContainer" containerID="cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.344088 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c"} err="failed to get container status \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": rpc error: code = NotFound desc = could not find container \"cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c\": container with ID starting with cd400fd007bec7543ab705386136ac794f8c4b56a2679a370f013421aa13560c not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.344141 4931 scope.go:117] "RemoveContainer" containerID="7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.344445 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c"} err="failed to get container status \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": rpc error: code = NotFound desc = could not find container \"7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c\": container with ID starting with 7ef95db790dfeb20be4030c02df79f432dee70d8b4f237ec1ec01553b5c1f98c not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.344471 4931 scope.go:117] "RemoveContainer" containerID="9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.344762 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8"} err="failed to get container status \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": rpc error: code = NotFound desc = could not find container \"9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8\": container with ID starting with 9306b8ab9acd91e0b773b5e755ba4c868d5dbf2863dae9eed3edf578a3b5bbb8 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.344791 4931 scope.go:117] "RemoveContainer" containerID="a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.345160 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29"} err="failed to get container status \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": rpc error: code = NotFound desc = could not find container \"a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29\": container with ID starting with a8e71ffb4316ce6909502b021dd80cc5d80c7ce1f480eaeb11a6a4198c0e7b29 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.345215 4931 scope.go:117] "RemoveContainer" containerID="839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.345593 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512"} err="failed to get container status \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": rpc error: code = NotFound desc = could not find container \"839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512\": container with ID starting with 839a02e2f55c01661faa021e3705a8969ef395cd9fbbce16ee22b3e4a1c82512 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.345618 4931 scope.go:117] "RemoveContainer" containerID="42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.345913 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3"} err="failed to get container status \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": rpc error: code = NotFound desc = could not find container \"42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3\": container with ID starting with 42be2b3ccb29f66b8f4ff7c6c5daa729046e541adf8447dac6987339efabbbd3 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.345940 4931 scope.go:117] "RemoveContainer" containerID="a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.346209 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b"} err="failed to get container status \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": rpc error: code = NotFound desc = could not find container \"a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b\": container with ID starting with a8837cc5d6b53dfbeeb0b552e656854dd0111ff1e35ff8fdf92c418fe3870b7b not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.346235 4931 scope.go:117] "RemoveContainer" containerID="baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.346587 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8"} err="failed to get container status \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": rpc error: code = NotFound desc = could not find container \"baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8\": container with ID starting with baa376478cdeb9a9f72f7c08d46154891a0e1b1d67d40ed2b31e48b3275806a8 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.346617 4931 scope.go:117] "RemoveContainer" containerID="61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.346895 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0"} err="failed to get container status \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": rpc error: code = NotFound desc = could not find container \"61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0\": container with ID starting with 61e4e9b07acc1ee07042237be757b14f1b8680e2bdae3a4c64390e055b00bfb0 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.346935 4931 scope.go:117] "RemoveContainer" containerID="e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.347335 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929"} err="failed to get container status \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": rpc error: code = NotFound desc = could not find container \"e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929\": container with ID starting with e4e6ad71fc07be5e4f4786c4302cf0df65534936f3d55a6a2976a8fa9f15d929 not found: ID does not exist"
Jan 30 05:18:20 crc kubenswrapper[4931]: I0130 05:18:20.359794 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr"
Jan 30 05:18:21 crc kubenswrapper[4931]: I0130 05:18:21.110459 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/2.log"
Jan 30 05:18:21 crc kubenswrapper[4931]: I0130 05:18:21.116879 4931 generic.go:334] "Generic (PLEG): container finished" podID="43fde21b-c04b-428e-a4bb-4f6e4969bd5f" containerID="4a4975462ab619a6fe83eeec32e381f839f4a0b39193714651129b6f8513eea2" exitCode=0
Jan 30 05:18:21 crc kubenswrapper[4931]: I0130 05:18:21.116937 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerDied","Data":"4a4975462ab619a6fe83eeec32e381f839f4a0b39193714651129b6f8513eea2"}
Jan 30 05:18:21 crc kubenswrapper[4931]: I0130 05:18:21.116980 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"448490ef9d880fe87b6b86021fd5d8abf9e1ea792e944858594d2bfebb08c624"}
Jan 30 05:18:21 crc kubenswrapper[4931]: I0130 05:18:21.431091 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556d9fc5-72b4-4134-8074-1e9d07012763" path="/var/lib/kubelet/pods/556d9fc5-72b4-4134-8074-1e9d07012763/volumes"
Jan 30 05:18:22 crc kubenswrapper[4931]: I0130 05:18:22.127574 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"7765234d2d4274b4269d0d1225cb6927813c464396ba77aec3ae59ff4dea7ac1"}
Jan 30 05:18:22 crc kubenswrapper[4931]: I0130 05:18:22.127977 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"bba9b8c2003f76cb846598347185800b35b93599340fe987706c8a31e115cdde"}
Jan 30 05:18:22 crc kubenswrapper[4931]: I0130 05:18:22.127998 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"aa67ee282b6ac130c8c72c8dcde4cbb3d54b1a96cbfb0c2927617f89f7096859"}
Jan 30 05:18:22 crc kubenswrapper[4931]: I0130 05:18:22.128017 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"cfa995b9fb3214346aa12c010b006cd6c0dfd6dd8d4da1ca01a02a72cbc91337"}
Jan 30 05:18:22 crc kubenswrapper[4931]: I0130 05:18:22.128036 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"917280023b05a7246efa0dd53af4fa0193bd41e1f06b879b46889c99f98f88c9"}
Jan 30 05:18:22 crc kubenswrapper[4931]: I0130 05:18:22.128054 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"adbc78a70d9bb6a1934105580e943aef2e50b51b8a076e9f8c8a63802e5a552c"}
Jan 30 05:18:25 crc kubenswrapper[4931]: I0130 05:18:25.159271 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"eafd81b26ecd62c7e0c251b2c5c1c810c1a880314f61f58fb274c6c1bc655810"}
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.770713 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-mlqzd"]
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.771887 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.774201 4931 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-ff66z"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.775208 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.775388 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.775890 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.883384 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f395498-8955-4aa5-b283-62e5b12505f1-node-mnt\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.883611 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vf8z\" (UniqueName: \"kubernetes.io/projected/7f395498-8955-4aa5-b283-62e5b12505f1-kube-api-access-9vf8z\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.883688 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f395498-8955-4aa5-b283-62e5b12505f1-crc-storage\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.984905 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f395498-8955-4aa5-b283-62e5b12505f1-node-mnt\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.985266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vf8z\" (UniqueName: \"kubernetes.io/projected/7f395498-8955-4aa5-b283-62e5b12505f1-kube-api-access-9vf8z\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.985288 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f395498-8955-4aa5-b283-62e5b12505f1-crc-storage\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.985678 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f395498-8955-4aa5-b283-62e5b12505f1-node-mnt\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:26 crc kubenswrapper[4931]: I0130 05:18:26.986132 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f395498-8955-4aa5-b283-62e5b12505f1-crc-storage\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.011785 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vf8z\" (UniqueName: \"kubernetes.io/projected/7f395498-8955-4aa5-b283-62e5b12505f1-kube-api-access-9vf8z\") pod \"crc-storage-crc-mlqzd\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") " pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.092074 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.113376 4931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(737c11a956ee077a82213f856be422791ad108dda5dbd2b38e8b7724d86caa2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.113458 4931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(737c11a956ee077a82213f856be422791ad108dda5dbd2b38e8b7724d86caa2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.113482 4931 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(737c11a956ee077a82213f856be422791ad108dda5dbd2b38e8b7724d86caa2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.113545 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(737c11a956ee077a82213f856be422791ad108dda5dbd2b38e8b7724d86caa2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-mlqzd" podUID="7f395498-8955-4aa5-b283-62e5b12505f1"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.202828 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" event={"ID":"43fde21b-c04b-428e-a4bb-4f6e4969bd5f","Type":"ContainerStarted","Data":"f494d142d3810a8c02eb414cfcd8f08530c3d20aeb73cd663fd224f262416510"}
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.203453 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.203530 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.203571 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.241123 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.242589 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.245605 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr" podStartSLOduration=7.245583745 podStartE2EDuration="7.245583745s" podCreationTimestamp="2026-01-30 05:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:18:27.241410226 +0000 UTC m=+642.611320503" watchObservedRunningTime="2026-01-30 05:18:27.245583745 +0000 UTC m=+642.615494002"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.492937 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mlqzd"]
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.493516 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: I0130 05:18:27.494316 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.520717 4931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(40ec2700f34632fc2691d55591e0d01e9f867adc960b314c3cde6c470e1c0e87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.520834 4931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(40ec2700f34632fc2691d55591e0d01e9f867adc960b314c3cde6c470e1c0e87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.520867 4931 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(40ec2700f34632fc2691d55591e0d01e9f867adc960b314c3cde6c470e1c0e87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:27 crc kubenswrapper[4931]: E0130 05:18:27.520951 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(40ec2700f34632fc2691d55591e0d01e9f867adc960b314c3cde6c470e1c0e87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-mlqzd" podUID="7f395498-8955-4aa5-b283-62e5b12505f1"
Jan 30 05:18:35 crc kubenswrapper[4931]: I0130 05:18:35.426998 4931 scope.go:117] "RemoveContainer" containerID="9cbe0bfee502f12e8f2f3a6f1a461efb27353f5529809ccc54fecbb26b304ada"
Jan 30 05:18:35 crc kubenswrapper[4931]: E0130 05:18:35.428056 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lm7vv_openshift-multus(b17d6adf-e35b-4bf8-9ab2-e6720e595835)\"" pod="openshift-multus/multus-lm7vv" podUID="b17d6adf-e35b-4bf8-9ab2-e6720e595835"
Jan 30 05:18:38 crc kubenswrapper[4931]: I0130 05:18:38.420987 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:38 crc kubenswrapper[4931]: I0130 05:18:38.421778 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:38 crc kubenswrapper[4931]: E0130 05:18:38.461527 4931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(422641cfdadcfe112831ce54cf208175350f6941b2c889305fecd5510bf2deed): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 05:18:38 crc kubenswrapper[4931]: E0130 05:18:38.461624 4931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(422641cfdadcfe112831ce54cf208175350f6941b2c889305fecd5510bf2deed): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:38 crc kubenswrapper[4931]: E0130 05:18:38.461659 4931 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(422641cfdadcfe112831ce54cf208175350f6941b2c889305fecd5510bf2deed): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:38 crc kubenswrapper[4931]: E0130 05:18:38.461739 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(422641cfdadcfe112831ce54cf208175350f6941b2c889305fecd5510bf2deed): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-mlqzd" podUID="7f395498-8955-4aa5-b283-62e5b12505f1"
Jan 30 05:18:48 crc kubenswrapper[4931]: I0130 05:18:48.422191 4931 scope.go:117] "RemoveContainer" containerID="9cbe0bfee502f12e8f2f3a6f1a461efb27353f5529809ccc54fecbb26b304ada"
Jan 30 05:18:49 crc kubenswrapper[4931]: I0130 05:18:49.376047 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lm7vv_b17d6adf-e35b-4bf8-9ab2-e6720e595835/kube-multus/2.log"
Jan 30 05:18:49 crc kubenswrapper[4931]: I0130 05:18:49.376520 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lm7vv" event={"ID":"b17d6adf-e35b-4bf8-9ab2-e6720e595835","Type":"ContainerStarted","Data":"60801b60c842bc20aee7bc70499d177ba3a056474ac483bd7ce3e22f46834e1d"}
Jan 30 05:18:49 crc kubenswrapper[4931]: I0130 05:18:49.421258 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:49 crc kubenswrapper[4931]: I0130 05:18:49.421904 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:49 crc kubenswrapper[4931]: E0130 05:18:49.487486 4931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(e0beac559643fc87ac54071364a74e16a75eb767f0589d87e2a18090fb0c942e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 05:18:49 crc kubenswrapper[4931]: E0130 05:18:49.487609 4931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(e0beac559643fc87ac54071364a74e16a75eb767f0589d87e2a18090fb0c942e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:49 crc kubenswrapper[4931]: E0130 05:18:49.487663 4931 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(e0beac559643fc87ac54071364a74e16a75eb767f0589d87e2a18090fb0c942e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:18:49 crc kubenswrapper[4931]: E0130 05:18:49.487773 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-mlqzd_crc-storage(7f395498-8955-4aa5-b283-62e5b12505f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-mlqzd_crc-storage_7f395498-8955-4aa5-b283-62e5b12505f1_0(e0beac559643fc87ac54071364a74e16a75eb767f0589d87e2a18090fb0c942e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-mlqzd" podUID="7f395498-8955-4aa5-b283-62e5b12505f1"
Jan 30 05:18:50 crc kubenswrapper[4931]: I0130 05:18:50.400654 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s68jr"
Jan 30 05:19:00 crc kubenswrapper[4931]: I0130 05:19:00.421139 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:19:00 crc kubenswrapper[4931]: I0130 05:19:00.422393 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:19:00 crc kubenswrapper[4931]: I0130 05:19:00.688567 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mlqzd"]
Jan 30 05:19:00 crc kubenswrapper[4931]: I0130 05:19:00.703017 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 05:19:01 crc kubenswrapper[4931]: I0130 05:19:01.493568 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mlqzd" event={"ID":"7f395498-8955-4aa5-b283-62e5b12505f1","Type":"ContainerStarted","Data":"a5bbe8e3c7fb83e204f7e74e58ca6a9e3036aba44081e4acf435a3516fa09a86"}
Jan 30 05:19:02 crc kubenswrapper[4931]: I0130 05:19:02.508288 4931 generic.go:334] "Generic (PLEG): container finished" podID="7f395498-8955-4aa5-b283-62e5b12505f1" containerID="ceeb8bcdff334f1b3490e1ee30443dff7dd6fd17a3f2d90428a1f38ad6f3cd5e" exitCode=0
Jan 30 05:19:02 crc kubenswrapper[4931]: I0130 05:19:02.508507 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mlqzd" event={"ID":"7f395498-8955-4aa5-b283-62e5b12505f1","Type":"ContainerDied","Data":"ceeb8bcdff334f1b3490e1ee30443dff7dd6fd17a3f2d90428a1f38ad6f3cd5e"}
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.837671 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd"
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.874359 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f395498-8955-4aa5-b283-62e5b12505f1-node-mnt\") pod \"7f395498-8955-4aa5-b283-62e5b12505f1\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") "
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.874415 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f395498-8955-4aa5-b283-62e5b12505f1-crc-storage\") pod \"7f395498-8955-4aa5-b283-62e5b12505f1\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") "
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.874450 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vf8z\" (UniqueName: \"kubernetes.io/projected/7f395498-8955-4aa5-b283-62e5b12505f1-kube-api-access-9vf8z\") pod \"7f395498-8955-4aa5-b283-62e5b12505f1\" (UID: \"7f395498-8955-4aa5-b283-62e5b12505f1\") "
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.874556 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f395498-8955-4aa5-b283-62e5b12505f1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7f395498-8955-4aa5-b283-62e5b12505f1" (UID: "7f395498-8955-4aa5-b283-62e5b12505f1"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.880951 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f395498-8955-4aa5-b283-62e5b12505f1-kube-api-access-9vf8z" (OuterVolumeSpecName: "kube-api-access-9vf8z") pod "7f395498-8955-4aa5-b283-62e5b12505f1" (UID: "7f395498-8955-4aa5-b283-62e5b12505f1"). InnerVolumeSpecName "kube-api-access-9vf8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.896323 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f395498-8955-4aa5-b283-62e5b12505f1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7f395498-8955-4aa5-b283-62e5b12505f1" (UID: "7f395498-8955-4aa5-b283-62e5b12505f1"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.975628 4931 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7f395498-8955-4aa5-b283-62e5b12505f1-node-mnt\") on node \"crc\" DevicePath \"\""
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.975670 4931 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7f395498-8955-4aa5-b283-62e5b12505f1-crc-storage\") on node \"crc\" DevicePath \"\""
Jan 30 05:19:03 crc kubenswrapper[4931]: I0130 05:19:03.975684 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vf8z\" (UniqueName: \"kubernetes.io/projected/7f395498-8955-4aa5-b283-62e5b12505f1-kube-api-access-9vf8z\") on node \"crc\" DevicePath \"\""
Jan 30 05:19:04 crc kubenswrapper[4931]: I0130 05:19:04.525994 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mlqzd" event={"ID":"7f395498-8955-4aa5-b283-62e5b12505f1","Type":"ContainerDied","Data":"a5bbe8e3c7fb83e204f7e74e58ca6a9e3036aba44081e4acf435a3516fa09a86"}
Jan 30 05:19:04 crc kubenswrapper[4931]: I0130 05:19:04.526305 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5bbe8e3c7fb83e204f7e74e58ca6a9e3036aba44081e4acf435a3516fa09a86"
Jan 30 05:19:04 crc kubenswrapper[4931]: I0130 05:19:04.526121 4931 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="crc-storage/crc-storage-crc-mlqzd" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.208906 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4"] Jan 30 05:19:12 crc kubenswrapper[4931]: E0130 05:19:12.209724 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f395498-8955-4aa5-b283-62e5b12505f1" containerName="storage" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.209744 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f395498-8955-4aa5-b283-62e5b12505f1" containerName="storage" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.209929 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f395498-8955-4aa5-b283-62e5b12505f1" containerName="storage" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.211069 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.213499 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.224384 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4"] Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.338876 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 
05:19:12.339246 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.339327 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpd46\" (UniqueName: \"kubernetes.io/projected/52241d6a-5526-4d2b-baeb-e1fd0361a188-kube-api-access-lpd46\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.441487 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.441575 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpd46\" (UniqueName: \"kubernetes.io/projected/52241d6a-5526-4d2b-baeb-e1fd0361a188-kube-api-access-lpd46\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.441642 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.442402 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.442972 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.475777 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpd46\" (UniqueName: \"kubernetes.io/projected/52241d6a-5526-4d2b-baeb-e1fd0361a188-kube-api-access-lpd46\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:12 crc kubenswrapper[4931]: I0130 05:19:12.537569 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:13 crc kubenswrapper[4931]: I0130 05:19:13.015710 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4"] Jan 30 05:19:13 crc kubenswrapper[4931]: I0130 05:19:13.589747 4931 generic.go:334] "Generic (PLEG): container finished" podID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerID="9e96d1070728e64bf500446afd3ca1d2228d224750611a62c09395ec57bc5ab0" exitCode=0 Jan 30 05:19:13 crc kubenswrapper[4931]: I0130 05:19:13.589861 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" event={"ID":"52241d6a-5526-4d2b-baeb-e1fd0361a188","Type":"ContainerDied","Data":"9e96d1070728e64bf500446afd3ca1d2228d224750611a62c09395ec57bc5ab0"} Jan 30 05:19:13 crc kubenswrapper[4931]: I0130 05:19:13.590207 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" event={"ID":"52241d6a-5526-4d2b-baeb-e1fd0361a188","Type":"ContainerStarted","Data":"ef6f7ae52213ba6bff1be6d319e5c662728138a3bc0b1903a9f9435f2e6d5101"} Jan 30 05:19:15 crc kubenswrapper[4931]: I0130 05:19:15.606184 4931 generic.go:334] "Generic (PLEG): container finished" podID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerID="b6f970948e85c4ef45a48cc9b5b4ef2675ec23befd30a8e3faeb7d9d92cc97b5" exitCode=0 Jan 30 05:19:15 crc kubenswrapper[4931]: I0130 05:19:15.606476 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" event={"ID":"52241d6a-5526-4d2b-baeb-e1fd0361a188","Type":"ContainerDied","Data":"b6f970948e85c4ef45a48cc9b5b4ef2675ec23befd30a8e3faeb7d9d92cc97b5"} Jan 30 05:19:16 crc kubenswrapper[4931]: I0130 05:19:16.617852 4931 
generic.go:334] "Generic (PLEG): container finished" podID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerID="2590304120f8b42483614a45981a2e4b17d4d576509eea7818260063069d09e5" exitCode=0 Jan 30 05:19:16 crc kubenswrapper[4931]: I0130 05:19:16.618080 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" event={"ID":"52241d6a-5526-4d2b-baeb-e1fd0361a188","Type":"ContainerDied","Data":"2590304120f8b42483614a45981a2e4b17d4d576509eea7818260063069d09e5"} Jan 30 05:19:17 crc kubenswrapper[4931]: I0130 05:19:17.955202 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.017043 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpd46\" (UniqueName: \"kubernetes.io/projected/52241d6a-5526-4d2b-baeb-e1fd0361a188-kube-api-access-lpd46\") pod \"52241d6a-5526-4d2b-baeb-e1fd0361a188\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.017192 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-bundle\") pod \"52241d6a-5526-4d2b-baeb-e1fd0361a188\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.017263 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-util\") pod \"52241d6a-5526-4d2b-baeb-e1fd0361a188\" (UID: \"52241d6a-5526-4d2b-baeb-e1fd0361a188\") " Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.018717 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-bundle" (OuterVolumeSpecName: "bundle") pod "52241d6a-5526-4d2b-baeb-e1fd0361a188" (UID: "52241d6a-5526-4d2b-baeb-e1fd0361a188"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.023762 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52241d6a-5526-4d2b-baeb-e1fd0361a188-kube-api-access-lpd46" (OuterVolumeSpecName: "kube-api-access-lpd46") pod "52241d6a-5526-4d2b-baeb-e1fd0361a188" (UID: "52241d6a-5526-4d2b-baeb-e1fd0361a188"). InnerVolumeSpecName "kube-api-access-lpd46". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.056439 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-util" (OuterVolumeSpecName: "util") pod "52241d6a-5526-4d2b-baeb-e1fd0361a188" (UID: "52241d6a-5526-4d2b-baeb-e1fd0361a188"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.119471 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.119535 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52241d6a-5526-4d2b-baeb-e1fd0361a188-util\") on node \"crc\" DevicePath \"\"" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.119560 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpd46\" (UniqueName: \"kubernetes.io/projected/52241d6a-5526-4d2b-baeb-e1fd0361a188-kube-api-access-lpd46\") on node \"crc\" DevicePath \"\"" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.635369 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" event={"ID":"52241d6a-5526-4d2b-baeb-e1fd0361a188","Type":"ContainerDied","Data":"ef6f7ae52213ba6bff1be6d319e5c662728138a3bc0b1903a9f9435f2e6d5101"} Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.635450 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef6f7ae52213ba6bff1be6d319e5c662728138a3bc0b1903a9f9435f2e6d5101" Jan 30 05:19:18 crc kubenswrapper[4931]: I0130 05:19:18.635543 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.697135 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5tdhq"] Jan 30 05:19:20 crc kubenswrapper[4931]: E0130 05:19:20.697574 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerName="util" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.697585 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerName="util" Jan 30 05:19:20 crc kubenswrapper[4931]: E0130 05:19:20.697597 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerName="pull" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.697603 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerName="pull" Jan 30 05:19:20 crc kubenswrapper[4931]: E0130 05:19:20.697612 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerName="extract" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.697618 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerName="extract" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.697700 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52241d6a-5526-4d2b-baeb-e1fd0361a188" containerName="extract" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.698060 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.699643 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gxxfj" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.700028 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.700572 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.708414 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5tdhq"] Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.756482 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv9xw\" (UniqueName: \"kubernetes.io/projected/1c291268-6fc4-48a1-94dc-1e9e052e7bc6-kube-api-access-zv9xw\") pod \"nmstate-operator-646758c888-5tdhq\" (UID: \"1c291268-6fc4-48a1-94dc-1e9e052e7bc6\") " pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.857902 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv9xw\" (UniqueName: \"kubernetes.io/projected/1c291268-6fc4-48a1-94dc-1e9e052e7bc6-kube-api-access-zv9xw\") pod \"nmstate-operator-646758c888-5tdhq\" (UID: \"1c291268-6fc4-48a1-94dc-1e9e052e7bc6\") " pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" Jan 30 05:19:20 crc kubenswrapper[4931]: I0130 05:19:20.878479 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv9xw\" (UniqueName: \"kubernetes.io/projected/1c291268-6fc4-48a1-94dc-1e9e052e7bc6-kube-api-access-zv9xw\") pod \"nmstate-operator-646758c888-5tdhq\" (UID: 
\"1c291268-6fc4-48a1-94dc-1e9e052e7bc6\") " pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" Jan 30 05:19:21 crc kubenswrapper[4931]: I0130 05:19:21.012057 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" Jan 30 05:19:21 crc kubenswrapper[4931]: I0130 05:19:21.283992 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5tdhq"] Jan 30 05:19:21 crc kubenswrapper[4931]: W0130 05:19:21.289938 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c291268_6fc4_48a1_94dc_1e9e052e7bc6.slice/crio-175c92fcf7f5771dd798b8c59937aa27735d46da1aa93fcff2dd15cbb070211d WatchSource:0}: Error finding container 175c92fcf7f5771dd798b8c59937aa27735d46da1aa93fcff2dd15cbb070211d: Status 404 returned error can't find the container with id 175c92fcf7f5771dd798b8c59937aa27735d46da1aa93fcff2dd15cbb070211d Jan 30 05:19:21 crc kubenswrapper[4931]: I0130 05:19:21.655816 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" event={"ID":"1c291268-6fc4-48a1-94dc-1e9e052e7bc6","Type":"ContainerStarted","Data":"175c92fcf7f5771dd798b8c59937aa27735d46da1aa93fcff2dd15cbb070211d"} Jan 30 05:19:24 crc kubenswrapper[4931]: I0130 05:19:24.684767 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" event={"ID":"1c291268-6fc4-48a1-94dc-1e9e052e7bc6","Type":"ContainerStarted","Data":"621677c1aef1ee0502a74c45f6339f605cec24c4c5cc5583afa3ef7b97829907"} Jan 30 05:19:24 crc kubenswrapper[4931]: I0130 05:19:24.717083 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-5tdhq" podStartSLOduration=2.499103366 podStartE2EDuration="4.717056583s" podCreationTimestamp="2026-01-30 05:19:20 +0000 UTC" 
firstStartedPulling="2026-01-30 05:19:21.292055165 +0000 UTC m=+696.661965432" lastFinishedPulling="2026-01-30 05:19:23.510008392 +0000 UTC m=+698.879918649" observedRunningTime="2026-01-30 05:19:24.711310456 +0000 UTC m=+700.081220753" watchObservedRunningTime="2026-01-30 05:19:24.717056583 +0000 UTC m=+700.086966880" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.709516 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2z4jr"] Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.710361 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.722213 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2z4jr"] Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.727853 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l"] Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.728729 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.729974 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.733026 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fp7dl" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.752709 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l"] Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.789173 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6mhzq"] Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.789819 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.823453 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llc2h\" (UniqueName: \"kubernetes.io/projected/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-kube-api-access-llc2h\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.823524 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-ovs-socket\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.823546 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-nmstate-lock\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.823572 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a115b68a-a9ad-44db-90f5-1f016556956a-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-krf2l\" (UID: \"a115b68a-a9ad-44db-90f5-1f016556956a\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.823603 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc6sc\" (UniqueName: \"kubernetes.io/projected/01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c-kube-api-access-sc6sc\") pod \"nmstate-metrics-54757c584b-2z4jr\" (UID: \"01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.823628 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-dbus-socket\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.823660 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flztx\" (UniqueName: \"kubernetes.io/projected/a115b68a-a9ad-44db-90f5-1f016556956a-kube-api-access-flztx\") pod \"nmstate-webhook-8474b5b9d8-krf2l\" (UID: \"a115b68a-a9ad-44db-90f5-1f016556956a\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.865614 4931 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2"] Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.866471 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.871069 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.871397 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.871540 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-khtrv" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.872976 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2"] Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.924971 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-dbus-socket\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925060 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8800ae15-51ee-4310-889d-3608008986bd-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925130 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flztx\" (UniqueName: 
\"kubernetes.io/projected/a115b68a-a9ad-44db-90f5-1f016556956a-kube-api-access-flztx\") pod \"nmstate-webhook-8474b5b9d8-krf2l\" (UID: \"a115b68a-a9ad-44db-90f5-1f016556956a\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925161 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hk8n\" (UniqueName: \"kubernetes.io/projected/8800ae15-51ee-4310-889d-3608008986bd-kube-api-access-7hk8n\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925285 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llc2h\" (UniqueName: \"kubernetes.io/projected/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-kube-api-access-llc2h\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925295 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-dbus-socket\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925320 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8800ae15-51ee-4310-889d-3608008986bd-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925378 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-ovs-socket\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925408 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-nmstate-lock\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925475 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a115b68a-a9ad-44db-90f5-1f016556956a-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-krf2l\" (UID: \"a115b68a-a9ad-44db-90f5-1f016556956a\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925498 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc6sc\" (UniqueName: \"kubernetes.io/projected/01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c-kube-api-access-sc6sc\") pod \"nmstate-metrics-54757c584b-2z4jr\" (UID: \"01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925603 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-ovs-socket\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.925660 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" 
(UniqueName: \"kubernetes.io/host-path/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-nmstate-lock\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.932934 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a115b68a-a9ad-44db-90f5-1f016556956a-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-krf2l\" (UID: \"a115b68a-a9ad-44db-90f5-1f016556956a\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.939511 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc6sc\" (UniqueName: \"kubernetes.io/projected/01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c-kube-api-access-sc6sc\") pod \"nmstate-metrics-54757c584b-2z4jr\" (UID: \"01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.939519 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llc2h\" (UniqueName: \"kubernetes.io/projected/66e77bed-ca3a-4cfe-874c-d6874c52ab0e-kube-api-access-llc2h\") pod \"nmstate-handler-6mhzq\" (UID: \"66e77bed-ca3a-4cfe-874c-d6874c52ab0e\") " pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:25 crc kubenswrapper[4931]: I0130 05:19:25.940231 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flztx\" (UniqueName: \"kubernetes.io/projected/a115b68a-a9ad-44db-90f5-1f016556956a-kube-api-access-flztx\") pod \"nmstate-webhook-8474b5b9d8-krf2l\" (UID: \"a115b68a-a9ad-44db-90f5-1f016556956a\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.026231 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8800ae15-51ee-4310-889d-3608008986bd-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:26 crc kubenswrapper[4931]: E0130 05:19:26.026555 4931 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 30 05:19:26 crc kubenswrapper[4931]: E0130 05:19:26.026653 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8800ae15-51ee-4310-889d-3608008986bd-plugin-serving-cert podName:8800ae15-51ee-4310-889d-3608008986bd nodeName:}" failed. No retries permitted until 2026-01-30 05:19:26.526627577 +0000 UTC m=+701.896537884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8800ae15-51ee-4310-889d-3608008986bd-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-mwzz2" (UID: "8800ae15-51ee-4310-889d-3608008986bd") : secret "plugin-serving-cert" not found Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.026715 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hk8n\" (UniqueName: \"kubernetes.io/projected/8800ae15-51ee-4310-889d-3608008986bd-kube-api-access-7hk8n\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.026879 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8800ae15-51ee-4310-889d-3608008986bd-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 
05:19:26.027637 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.027861 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8800ae15-51ee-4310-889d-3608008986bd-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.045731 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.046677 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hk8n\" (UniqueName: \"kubernetes.io/projected/8800ae15-51ee-4310-889d-3608008986bd-kube-api-access-7hk8n\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.055554 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5977cc965f-tjfns"] Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.060310 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.070074 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5977cc965f-tjfns"] Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.102708 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.127538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-oauth-serving-cert\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.127586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62gt\" (UniqueName: \"kubernetes.io/projected/5ce22043-f6b4-4294-8522-339a87e7b68a-kube-api-access-j62gt\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.127604 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-service-ca\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.127628 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ce22043-f6b4-4294-8522-339a87e7b68a-console-oauth-config\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.127646 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5ce22043-f6b4-4294-8522-339a87e7b68a-console-serving-cert\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.127675 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-trusted-ca-bundle\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.127710 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-console-config\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.229140 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-console-config\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.229488 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-oauth-serving-cert\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.229521 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62gt\" 
(UniqueName: \"kubernetes.io/projected/5ce22043-f6b4-4294-8522-339a87e7b68a-kube-api-access-j62gt\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.229538 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-service-ca\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.229561 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ce22043-f6b4-4294-8522-339a87e7b68a-console-oauth-config\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.229580 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ce22043-f6b4-4294-8522-339a87e7b68a-console-serving-cert\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.229614 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-trusted-ca-bundle\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.230019 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-console-config\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.230571 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-oauth-serving-cert\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.230613 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-trusted-ca-bundle\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.230619 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ce22043-f6b4-4294-8522-339a87e7b68a-service-ca\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.233847 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ce22043-f6b4-4294-8522-339a87e7b68a-console-serving-cert\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.236205 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5ce22043-f6b4-4294-8522-339a87e7b68a-console-oauth-config\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.245687 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62gt\" (UniqueName: \"kubernetes.io/projected/5ce22043-f6b4-4294-8522-339a87e7b68a-kube-api-access-j62gt\") pod \"console-5977cc965f-tjfns\" (UID: \"5ce22043-f6b4-4294-8522-339a87e7b68a\") " pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.426636 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.517237 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l"] Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.533611 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8800ae15-51ee-4310-889d-3608008986bd-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.541337 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8800ae15-51ee-4310-889d-3608008986bd-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mwzz2\" (UID: \"8800ae15-51ee-4310-889d-3608008986bd\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.576210 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2z4jr"] Jan 
30 05:19:26 crc kubenswrapper[4931]: W0130 05:19:26.591160 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e6ed8f_a69f_4e32_b275_6ea9a5cebf1c.slice/crio-530ad43d810bf2ae458c6609c9427c28be64ec1ae5d6cceba20ec0377c60d263 WatchSource:0}: Error finding container 530ad43d810bf2ae458c6609c9427c28be64ec1ae5d6cceba20ec0377c60d263: Status 404 returned error can't find the container with id 530ad43d810bf2ae458c6609c9427c28be64ec1ae5d6cceba20ec0377c60d263 Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.694952 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" event={"ID":"a115b68a-a9ad-44db-90f5-1f016556956a","Type":"ContainerStarted","Data":"27ff35d51ed7fe223c212d077ed66e09d42ee05f18c01ce45798ed92a9c58657"} Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.695687 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" event={"ID":"01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c","Type":"ContainerStarted","Data":"530ad43d810bf2ae458c6609c9427c28be64ec1ae5d6cceba20ec0377c60d263"} Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.696617 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6mhzq" event={"ID":"66e77bed-ca3a-4cfe-874c-d6874c52ab0e","Type":"ContainerStarted","Data":"e84ba3b0367eb90cfc282570737824f48e8d330dc37c08681d53d96c66655473"} Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.702685 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5977cc965f-tjfns"] Jan 30 05:19:26 crc kubenswrapper[4931]: W0130 05:19:26.711343 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce22043_f6b4_4294_8522_339a87e7b68a.slice/crio-78dcd5682a9e6c3d5454103207ad8a6fdc5ee7ad29bcf0d273d0c702d1e16c3d WatchSource:0}: 
Error finding container 78dcd5682a9e6c3d5454103207ad8a6fdc5ee7ad29bcf0d273d0c702d1e16c3d: Status 404 returned error can't find the container with id 78dcd5682a9e6c3d5454103207ad8a6fdc5ee7ad29bcf0d273d0c702d1e16c3d Jan 30 05:19:26 crc kubenswrapper[4931]: I0130 05:19:26.785650 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" Jan 30 05:19:27 crc kubenswrapper[4931]: I0130 05:19:27.083759 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2"] Jan 30 05:19:27 crc kubenswrapper[4931]: I0130 05:19:27.363293 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:19:27 crc kubenswrapper[4931]: I0130 05:19:27.363377 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:19:27 crc kubenswrapper[4931]: I0130 05:19:27.704353 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5977cc965f-tjfns" event={"ID":"5ce22043-f6b4-4294-8522-339a87e7b68a","Type":"ContainerStarted","Data":"313f21e9c9f42aa6e433cbbfe8c7f75d13dffb7ee072c68df8f5f2720218bc4f"} Jan 30 05:19:27 crc kubenswrapper[4931]: I0130 05:19:27.704441 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5977cc965f-tjfns" event={"ID":"5ce22043-f6b4-4294-8522-339a87e7b68a","Type":"ContainerStarted","Data":"78dcd5682a9e6c3d5454103207ad8a6fdc5ee7ad29bcf0d273d0c702d1e16c3d"} Jan 30 
05:19:27 crc kubenswrapper[4931]: I0130 05:19:27.706257 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" event={"ID":"8800ae15-51ee-4310-889d-3608008986bd","Type":"ContainerStarted","Data":"27994bc45d61429b45f71fd70a54a4edd07c28f0cf60088b3894cadc11473b47"} Jan 30 05:19:27 crc kubenswrapper[4931]: I0130 05:19:27.724268 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5977cc965f-tjfns" podStartSLOduration=1.72424194 podStartE2EDuration="1.72424194s" podCreationTimestamp="2026-01-30 05:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:19:27.719836652 +0000 UTC m=+703.089746909" watchObservedRunningTime="2026-01-30 05:19:27.72424194 +0000 UTC m=+703.094152227" Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.727938 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6mhzq" event={"ID":"66e77bed-ca3a-4cfe-874c-d6874c52ab0e","Type":"ContainerStarted","Data":"7630dbb4ff8d4f848de55d2dd84579beace9374a92232dae6ae7e8cb77b2d1f0"} Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.728616 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.730666 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" event={"ID":"8800ae15-51ee-4310-889d-3608008986bd","Type":"ContainerStarted","Data":"3a865d76a867ef3b8859eff6e42dafb59f7f43dc44b5a90b29caf1523c57ec76"} Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.733917 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" 
event={"ID":"a115b68a-a9ad-44db-90f5-1f016556956a","Type":"ContainerStarted","Data":"237f5a71a2f9711fa60cd985670de36aa38ade9a068890f966034467bae49f4c"} Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.734633 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.736247 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" event={"ID":"01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c","Type":"ContainerStarted","Data":"3bbb20decbdc6b6c78b7947a9c77966b342061e41ee8a4b9a0616c1b9f99ba19"} Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.753747 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6mhzq" podStartSLOduration=2.352185375 podStartE2EDuration="4.753731503s" podCreationTimestamp="2026-01-30 05:19:25 +0000 UTC" firstStartedPulling="2026-01-30 05:19:26.136596885 +0000 UTC m=+701.506507142" lastFinishedPulling="2026-01-30 05:19:28.538143013 +0000 UTC m=+703.908053270" observedRunningTime="2026-01-30 05:19:29.751996292 +0000 UTC m=+705.121906589" watchObservedRunningTime="2026-01-30 05:19:29.753731503 +0000 UTC m=+705.123641760" Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.778513 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" podStartSLOduration=2.736762852 podStartE2EDuration="4.77849036s" podCreationTimestamp="2026-01-30 05:19:25 +0000 UTC" firstStartedPulling="2026-01-30 05:19:26.52624159 +0000 UTC m=+701.896151877" lastFinishedPulling="2026-01-30 05:19:28.567969128 +0000 UTC m=+703.937879385" observedRunningTime="2026-01-30 05:19:29.767196193 +0000 UTC m=+705.137106490" watchObservedRunningTime="2026-01-30 05:19:29.77849036 +0000 UTC m=+705.148400637" Jan 30 05:19:29 crc kubenswrapper[4931]: I0130 05:19:29.844863 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mwzz2" podStartSLOduration=2.393908874 podStartE2EDuration="4.844834124s" podCreationTimestamp="2026-01-30 05:19:25 +0000 UTC" firstStartedPulling="2026-01-30 05:19:27.096698378 +0000 UTC m=+702.466608635" lastFinishedPulling="2026-01-30 05:19:29.547623608 +0000 UTC m=+704.917533885" observedRunningTime="2026-01-30 05:19:29.843535086 +0000 UTC m=+705.213445363" watchObservedRunningTime="2026-01-30 05:19:29.844834124 +0000 UTC m=+705.214744421" Jan 30 05:19:31 crc kubenswrapper[4931]: I0130 05:19:31.754982 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" event={"ID":"01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c","Type":"ContainerStarted","Data":"56c2626c9f36c80fd038f53f467d581b5d6e1fbcd527b45985cc7cdb7ece24e3"} Jan 30 05:19:31 crc kubenswrapper[4931]: I0130 05:19:31.785076 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-2z4jr" podStartSLOduration=2.545185019 podStartE2EDuration="6.78504617s" podCreationTimestamp="2026-01-30 05:19:25 +0000 UTC" firstStartedPulling="2026-01-30 05:19:26.597410503 +0000 UTC m=+701.967320800" lastFinishedPulling="2026-01-30 05:19:30.837271684 +0000 UTC m=+706.207181951" observedRunningTime="2026-01-30 05:19:31.781755754 +0000 UTC m=+707.151666041" watchObservedRunningTime="2026-01-30 05:19:31.78504617 +0000 UTC m=+707.154956467" Jan 30 05:19:36 crc kubenswrapper[4931]: I0130 05:19:36.140564 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6mhzq" Jan 30 05:19:36 crc kubenswrapper[4931]: I0130 05:19:36.428172 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:36 crc kubenswrapper[4931]: I0130 05:19:36.429353 4931 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:36 crc kubenswrapper[4931]: I0130 05:19:36.435521 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:36 crc kubenswrapper[4931]: I0130 05:19:36.794337 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5977cc965f-tjfns" Jan 30 05:19:36 crc kubenswrapper[4931]: I0130 05:19:36.871527 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ff4lr"] Jan 30 05:19:46 crc kubenswrapper[4931]: I0130 05:19:46.054453 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-krf2l" Jan 30 05:19:57 crc kubenswrapper[4931]: I0130 05:19:57.363729 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:19:57 crc kubenswrapper[4931]: I0130 05:19:57.364463 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:20:01 crc kubenswrapper[4931]: I0130 05:20:01.923875 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ff4lr" podUID="cf0e8eba-09e8-4d9c-87de-9c57583e7276" containerName="console" containerID="cri-o://0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469" gracePeriod=15 Jan 30 05:20:01 crc kubenswrapper[4931]: I0130 05:20:01.949514 4931 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6"] Jan 30 05:20:01 crc kubenswrapper[4931]: I0130 05:20:01.950740 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:01 crc kubenswrapper[4931]: I0130 05:20:01.955142 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 05:20:01 crc kubenswrapper[4931]: I0130 05:20:01.980883 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6"] Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.144300 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw68b\" (UniqueName: \"kubernetes.io/projected/150d0383-4876-424e-b189-6ce3cceccb72-kube-api-access-hw68b\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.144822 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.145001 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-util\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.246756 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.246851 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.247283 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.247323 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.247378 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw68b\" (UniqueName: \"kubernetes.io/projected/150d0383-4876-424e-b189-6ce3cceccb72-kube-api-access-hw68b\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.253537 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ff4lr_cf0e8eba-09e8-4d9c-87de-9c57583e7276/console/0.log" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.253684 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.287463 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw68b\" (UniqueName: \"kubernetes.io/projected/150d0383-4876-424e-b189-6ce3cceccb72-kube-api-access-hw68b\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.347805 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-oauth-serving-cert\") pod \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.348186 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-serving-cert\") pod \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.348705 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cf0e8eba-09e8-4d9c-87de-9c57583e7276" (UID: "cf0e8eba-09e8-4d9c-87de-9c57583e7276"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.349098 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-oauth-config\") pod \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.349177 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-trusted-ca-bundle\") pod \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.349247 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-config\") pod \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.349276 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-service-ca\") 
pod \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.349315 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmxjn\" (UniqueName: \"kubernetes.io/projected/cf0e8eba-09e8-4d9c-87de-9c57583e7276-kube-api-access-cmxjn\") pod \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\" (UID: \"cf0e8eba-09e8-4d9c-87de-9c57583e7276\") " Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.349842 4931 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.349949 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-config" (OuterVolumeSpecName: "console-config") pod "cf0e8eba-09e8-4d9c-87de-9c57583e7276" (UID: "cf0e8eba-09e8-4d9c-87de-9c57583e7276"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.350012 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cf0e8eba-09e8-4d9c-87de-9c57583e7276" (UID: "cf0e8eba-09e8-4d9c-87de-9c57583e7276"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.350679 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-service-ca" (OuterVolumeSpecName: "service-ca") pod "cf0e8eba-09e8-4d9c-87de-9c57583e7276" (UID: "cf0e8eba-09e8-4d9c-87de-9c57583e7276"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.354273 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cf0e8eba-09e8-4d9c-87de-9c57583e7276" (UID: "cf0e8eba-09e8-4d9c-87de-9c57583e7276"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.354760 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0e8eba-09e8-4d9c-87de-9c57583e7276-kube-api-access-cmxjn" (OuterVolumeSpecName: "kube-api-access-cmxjn") pod "cf0e8eba-09e8-4d9c-87de-9c57583e7276" (UID: "cf0e8eba-09e8-4d9c-87de-9c57583e7276"). InnerVolumeSpecName "kube-api-access-cmxjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.354769 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cf0e8eba-09e8-4d9c-87de-9c57583e7276" (UID: "cf0e8eba-09e8-4d9c-87de-9c57583e7276"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.452863 4931 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.452918 4931 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.452937 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.452954 4931 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.452971 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e8eba-09e8-4d9c-87de-9c57583e7276-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.453023 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmxjn\" (UniqueName: \"kubernetes.io/projected/cf0e8eba-09e8-4d9c-87de-9c57583e7276-kube-api-access-cmxjn\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.569866 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:02 crc kubenswrapper[4931]: I0130 05:20:02.817861 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6"] Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.020660 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ff4lr_cf0e8eba-09e8-4d9c-87de-9c57583e7276/console/0.log" Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.020747 4931 generic.go:334] "Generic (PLEG): container finished" podID="cf0e8eba-09e8-4d9c-87de-9c57583e7276" containerID="0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469" exitCode=2 Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.020848 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ff4lr" Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.020850 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ff4lr" event={"ID":"cf0e8eba-09e8-4d9c-87de-9c57583e7276","Type":"ContainerDied","Data":"0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469"} Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.021098 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ff4lr" event={"ID":"cf0e8eba-09e8-4d9c-87de-9c57583e7276","Type":"ContainerDied","Data":"6ef4e3652e767b58bcd714efc40fa7c13d1316dc132366e3239b8378ad811289"} Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.021147 4931 scope.go:117] "RemoveContainer" containerID="0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469" Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.025061 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" event={"ID":"150d0383-4876-424e-b189-6ce3cceccb72","Type":"ContainerStarted","Data":"b8ea0d39de1aa49f8377b1ecb36b19511758a7d0acc80c3cd8e472a335dae098"} Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.025120 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" event={"ID":"150d0383-4876-424e-b189-6ce3cceccb72","Type":"ContainerStarted","Data":"21ab14cb6f82da964f1a6c0f8bdda169d8c7afca5fe9ac4060f3a5f594f49867"} Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.106740 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ff4lr"] Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.107745 4931 scope.go:117] "RemoveContainer" containerID="0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469" Jan 30 05:20:03 crc kubenswrapper[4931]: E0130 05:20:03.108752 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469\": container with ID starting with 0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469 not found: ID does not exist" containerID="0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469" Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.108817 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469"} err="failed to get container status \"0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469\": rpc error: code = NotFound desc = could not find container \"0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469\": container with ID starting with 0fe79afe64820780e0fe369f0129eb24a6f58368212600b876f958b976ab3469 not found: ID does 
not exist" Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.112557 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ff4lr"] Jan 30 05:20:03 crc kubenswrapper[4931]: I0130 05:20:03.437189 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0e8eba-09e8-4d9c-87de-9c57583e7276" path="/var/lib/kubelet/pods/cf0e8eba-09e8-4d9c-87de-9c57583e7276/volumes" Jan 30 05:20:04 crc kubenswrapper[4931]: I0130 05:20:04.035519 4931 generic.go:334] "Generic (PLEG): container finished" podID="150d0383-4876-424e-b189-6ce3cceccb72" containerID="b8ea0d39de1aa49f8377b1ecb36b19511758a7d0acc80c3cd8e472a335dae098" exitCode=0 Jan 30 05:20:04 crc kubenswrapper[4931]: I0130 05:20:04.035571 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" event={"ID":"150d0383-4876-424e-b189-6ce3cceccb72","Type":"ContainerDied","Data":"b8ea0d39de1aa49f8377b1ecb36b19511758a7d0acc80c3cd8e472a335dae098"} Jan 30 05:20:06 crc kubenswrapper[4931]: I0130 05:20:06.057329 4931 generic.go:334] "Generic (PLEG): container finished" podID="150d0383-4876-424e-b189-6ce3cceccb72" containerID="43de2be3b10e7b3f73c9e9d4410f86d05869ae3591a6ffa147ddbbc37f0a585d" exitCode=0 Jan 30 05:20:06 crc kubenswrapper[4931]: I0130 05:20:06.057493 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" event={"ID":"150d0383-4876-424e-b189-6ce3cceccb72","Type":"ContainerDied","Data":"43de2be3b10e7b3f73c9e9d4410f86d05869ae3591a6ffa147ddbbc37f0a585d"} Jan 30 05:20:07 crc kubenswrapper[4931]: I0130 05:20:07.069629 4931 generic.go:334] "Generic (PLEG): container finished" podID="150d0383-4876-424e-b189-6ce3cceccb72" containerID="df26810aae5b9065c104376b590b00682c6e6d7f7635ff7f535375e2d768201f" exitCode=0 Jan 30 05:20:07 crc kubenswrapper[4931]: I0130 05:20:07.069698 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" event={"ID":"150d0383-4876-424e-b189-6ce3cceccb72","Type":"ContainerDied","Data":"df26810aae5b9065c104376b590b00682c6e6d7f7635ff7f535375e2d768201f"} Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.380326 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.543409 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw68b\" (UniqueName: \"kubernetes.io/projected/150d0383-4876-424e-b189-6ce3cceccb72-kube-api-access-hw68b\") pod \"150d0383-4876-424e-b189-6ce3cceccb72\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.543574 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-bundle\") pod \"150d0383-4876-424e-b189-6ce3cceccb72\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.543631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-util\") pod \"150d0383-4876-424e-b189-6ce3cceccb72\" (UID: \"150d0383-4876-424e-b189-6ce3cceccb72\") " Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.544505 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-bundle" (OuterVolumeSpecName: "bundle") pod "150d0383-4876-424e-b189-6ce3cceccb72" (UID: "150d0383-4876-424e-b189-6ce3cceccb72"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.548541 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150d0383-4876-424e-b189-6ce3cceccb72-kube-api-access-hw68b" (OuterVolumeSpecName: "kube-api-access-hw68b") pod "150d0383-4876-424e-b189-6ce3cceccb72" (UID: "150d0383-4876-424e-b189-6ce3cceccb72"). InnerVolumeSpecName "kube-api-access-hw68b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.583199 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-util" (OuterVolumeSpecName: "util") pod "150d0383-4876-424e-b189-6ce3cceccb72" (UID: "150d0383-4876-424e-b189-6ce3cceccb72"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.644842 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.644889 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150d0383-4876-424e-b189-6ce3cceccb72-util\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:08 crc kubenswrapper[4931]: I0130 05:20:08.644908 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw68b\" (UniqueName: \"kubernetes.io/projected/150d0383-4876-424e-b189-6ce3cceccb72-kube-api-access-hw68b\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:09 crc kubenswrapper[4931]: I0130 05:20:09.087976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" 
event={"ID":"150d0383-4876-424e-b189-6ce3cceccb72","Type":"ContainerDied","Data":"21ab14cb6f82da964f1a6c0f8bdda169d8c7afca5fe9ac4060f3a5f594f49867"} Jan 30 05:20:09 crc kubenswrapper[4931]: I0130 05:20:09.088036 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ab14cb6f82da964f1a6c0f8bdda169d8c7afca5fe9ac4060f3a5f594f49867" Jan 30 05:20:09 crc kubenswrapper[4931]: I0130 05:20:09.088072 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6" Jan 30 05:20:19 crc kubenswrapper[4931]: I0130 05:20:19.440375 4931 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.095670 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg"] Jan 30 05:20:20 crc kubenswrapper[4931]: E0130 05:20:20.095863 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150d0383-4876-424e-b189-6ce3cceccb72" containerName="extract" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.095875 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="150d0383-4876-424e-b189-6ce3cceccb72" containerName="extract" Jan 30 05:20:20 crc kubenswrapper[4931]: E0130 05:20:20.095894 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150d0383-4876-424e-b189-6ce3cceccb72" containerName="pull" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.095900 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="150d0383-4876-424e-b189-6ce3cceccb72" containerName="pull" Jan 30 05:20:20 crc kubenswrapper[4931]: E0130 05:20:20.095911 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150d0383-4876-424e-b189-6ce3cceccb72" containerName="util" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.095916 
4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="150d0383-4876-424e-b189-6ce3cceccb72" containerName="util" Jan 30 05:20:20 crc kubenswrapper[4931]: E0130 05:20:20.095926 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0e8eba-09e8-4d9c-87de-9c57583e7276" containerName="console" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.095932 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0e8eba-09e8-4d9c-87de-9c57583e7276" containerName="console" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.096018 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="150d0383-4876-424e-b189-6ce3cceccb72" containerName="extract" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.096033 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0e8eba-09e8-4d9c-87de-9c57583e7276" containerName="console" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.096403 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.104246 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.104560 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.104523 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.104658 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-f5n8j" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.104781 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.161524 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg"] Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.296878 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-webhook-cert\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.296922 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8ckk\" (UniqueName: \"kubernetes.io/projected/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-kube-api-access-n8ckk\") pod 
\"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.296974 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-apiservice-cert\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.351352 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf"] Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.352177 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.354821 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.354934 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.355224 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bqx46" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.388912 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf"] Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.398160 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/47321851-ef2d-47a3-949a-58f2e87df8dd-webhook-cert\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.398249 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-apiservice-cert\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.398314 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjzn\" (UniqueName: \"kubernetes.io/projected/47321851-ef2d-47a3-949a-58f2e87df8dd-kube-api-access-rqjzn\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.398362 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-webhook-cert\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.398393 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8ckk\" (UniqueName: \"kubernetes.io/projected/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-kube-api-access-n8ckk\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " 
pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.398445 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47321851-ef2d-47a3-949a-58f2e87df8dd-apiservice-cert\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.405592 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-webhook-cert\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.407101 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-apiservice-cert\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.454042 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8ckk\" (UniqueName: \"kubernetes.io/projected/164111f5-1bd4-4fc2-84f5-7418ee6e7e62-kube-api-access-n8ckk\") pod \"metallb-operator-controller-manager-6969d469fc-rzjqg\" (UID: \"164111f5-1bd4-4fc2-84f5-7418ee6e7e62\") " pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.500990 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/47321851-ef2d-47a3-949a-58f2e87df8dd-apiservice-cert\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.501062 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47321851-ef2d-47a3-949a-58f2e87df8dd-webhook-cert\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.501111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjzn\" (UniqueName: \"kubernetes.io/projected/47321851-ef2d-47a3-949a-58f2e87df8dd-kube-api-access-rqjzn\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.509982 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47321851-ef2d-47a3-949a-58f2e87df8dd-apiservice-cert\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.522490 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47321851-ef2d-47a3-949a-58f2e87df8dd-webhook-cert\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" Jan 30 
05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.532230 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjzn\" (UniqueName: \"kubernetes.io/projected/47321851-ef2d-47a3-949a-58f2e87df8dd-kube-api-access-rqjzn\") pod \"metallb-operator-webhook-server-7659bb7b4d-ssrqf\" (UID: \"47321851-ef2d-47a3-949a-58f2e87df8dd\") " pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf"
Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.675251 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf"
Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.711251 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg"
Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.880177 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf"]
Jan 30 05:20:20 crc kubenswrapper[4931]: W0130 05:20:20.896210 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47321851_ef2d_47a3_949a_58f2e87df8dd.slice/crio-3ffa202fc561fca0cfedec3a1fd08b9c9ea54f4c21dffcfd3ad688c30952b6ed WatchSource:0}: Error finding container 3ffa202fc561fca0cfedec3a1fd08b9c9ea54f4c21dffcfd3ad688c30952b6ed: Status 404 returned error can't find the container with id 3ffa202fc561fca0cfedec3a1fd08b9c9ea54f4c21dffcfd3ad688c30952b6ed
Jan 30 05:20:20 crc kubenswrapper[4931]: I0130 05:20:20.931210 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg"]
Jan 30 05:20:20 crc kubenswrapper[4931]: W0130 05:20:20.935541 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164111f5_1bd4_4fc2_84f5_7418ee6e7e62.slice/crio-62739bcd7c9196a59d8e45e81b5b4a936544a9cfd2d710b9b8d6024a71442c07 WatchSource:0}: Error finding container 62739bcd7c9196a59d8e45e81b5b4a936544a9cfd2d710b9b8d6024a71442c07: Status 404 returned error can't find the container with id 62739bcd7c9196a59d8e45e81b5b4a936544a9cfd2d710b9b8d6024a71442c07
Jan 30 05:20:21 crc kubenswrapper[4931]: I0130 05:20:21.175359 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" event={"ID":"164111f5-1bd4-4fc2-84f5-7418ee6e7e62","Type":"ContainerStarted","Data":"62739bcd7c9196a59d8e45e81b5b4a936544a9cfd2d710b9b8d6024a71442c07"}
Jan 30 05:20:21 crc kubenswrapper[4931]: I0130 05:20:21.176944 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" event={"ID":"47321851-ef2d-47a3-949a-58f2e87df8dd","Type":"ContainerStarted","Data":"3ffa202fc561fca0cfedec3a1fd08b9c9ea54f4c21dffcfd3ad688c30952b6ed"}
Jan 30 05:20:25 crc kubenswrapper[4931]: I0130 05:20:25.203539 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" event={"ID":"164111f5-1bd4-4fc2-84f5-7418ee6e7e62","Type":"ContainerStarted","Data":"b7c1b4383a10a13ffb4a3e2d895cf39fa28755c38c12fc5357eda97029d6852e"}
Jan 30 05:20:25 crc kubenswrapper[4931]: I0130 05:20:25.204147 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg"
Jan 30 05:20:25 crc kubenswrapper[4931]: I0130 05:20:25.226768 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg" podStartSLOduration=2.007064976 podStartE2EDuration="5.226753892s" podCreationTimestamp="2026-01-30 05:20:20 +0000 UTC" firstStartedPulling="2026-01-30 05:20:20.938157926 +0000 UTC m=+756.308068193" lastFinishedPulling="2026-01-30 05:20:24.157846852 +0000 UTC m=+759.527757109" observedRunningTime="2026-01-30 05:20:25.22636151 +0000 UTC m=+760.596271767" watchObservedRunningTime="2026-01-30 05:20:25.226753892 +0000 UTC m=+760.596664149"
Jan 30 05:20:27 crc kubenswrapper[4931]: I0130 05:20:27.363754 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:20:27 crc kubenswrapper[4931]: I0130 05:20:27.364122 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:20:27 crc kubenswrapper[4931]: I0130 05:20:27.364199 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
Jan 30 05:20:27 crc kubenswrapper[4931]: I0130 05:20:27.365256 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1794ca6ffdd404c39dffe9fa048526a3a78869de00d876a52d3bd280c8bbc2a2"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:20:27 crc kubenswrapper[4931]: I0130 05:20:27.365389 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://1794ca6ffdd404c39dffe9fa048526a3a78869de00d876a52d3bd280c8bbc2a2" gracePeriod=600
Jan 30 05:20:28 crc kubenswrapper[4931]: I0130 05:20:28.225716 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="1794ca6ffdd404c39dffe9fa048526a3a78869de00d876a52d3bd280c8bbc2a2" exitCode=0
Jan 30 05:20:28 crc kubenswrapper[4931]: I0130 05:20:28.225810 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"1794ca6ffdd404c39dffe9fa048526a3a78869de00d876a52d3bd280c8bbc2a2"}
Jan 30 05:20:28 crc kubenswrapper[4931]: I0130 05:20:28.226491 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"60aa2b4543ca7f8bb21bef3c167f0da099829d6ddc544f185e09f8c4de74ad75"}
Jan 30 05:20:28 crc kubenswrapper[4931]: I0130 05:20:28.226541 4931 scope.go:117] "RemoveContainer" containerID="ca23316a7a7a0870cd6ce778a3ddf7b3692d29f58078872d0288efcbee40c2e8"
Jan 30 05:20:28 crc kubenswrapper[4931]: I0130 05:20:28.228864 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" event={"ID":"47321851-ef2d-47a3-949a-58f2e87df8dd","Type":"ContainerStarted","Data":"159bf76c0061e6eddc60e9893eaf6b70e748433a758ebf68372dfd7d9f90924b"}
Jan 30 05:20:28 crc kubenswrapper[4931]: I0130 05:20:28.229071 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf"
Jan 30 05:20:28 crc kubenswrapper[4931]: I0130 05:20:28.280701 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf" podStartSLOduration=1.965692415 podStartE2EDuration="8.280680766s" podCreationTimestamp="2026-01-30 05:20:20 +0000 UTC" firstStartedPulling="2026-01-30 05:20:20.901377705 +0000 UTC m=+756.271287972" lastFinishedPulling="2026-01-30 05:20:27.216366066 +0000 UTC m=+762.586276323" observedRunningTime="2026-01-30 05:20:28.269532561 +0000 UTC m=+763.639442848" watchObservedRunningTime="2026-01-30 05:20:28.280680766 +0000 UTC m=+763.650591043"
Jan 30 05:20:40 crc kubenswrapper[4931]: I0130 05:20:40.684898 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7659bb7b4d-ssrqf"
Jan 30 05:21:00 crc kubenswrapper[4931]: I0130 05:21:00.714956 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6969d469fc-rzjqg"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.491215 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"]
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.492451 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.494667 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.494905 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4ztgc"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.498446 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-768qr"]
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.501205 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.502703 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.504003 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.504471 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"]
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578510 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-sockets\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578568 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-56ftz\" (UID: \"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-metrics\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578633 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghnp\" (UniqueName: \"kubernetes.io/projected/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-kube-api-access-pghnp\") pod \"frr-k8s-webhook-server-7df86c4f6c-56ftz\" (UID: \"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578677 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be5b19a8-200f-462e-b8f2-fc956ec52080-metrics-certs\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-reloader\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578770 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-conf\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjjpw\" (UniqueName: \"kubernetes.io/projected/be5b19a8-200f-462e-b8f2-fc956ec52080-kube-api-access-tjjpw\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.578827 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-startup\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.605210 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rcpl2"]
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.606103 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.610349 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tf9cf"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.610409 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.610461 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.610504 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.622627 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-g5mxs"]
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.623488 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.625615 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.649157 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-g5mxs"]
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679479 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-56ftz\" (UID: \"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679526 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-metrics\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679559 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghnp\" (UniqueName: \"kubernetes.io/projected/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-kube-api-access-pghnp\") pod \"frr-k8s-webhook-server-7df86c4f6c-56ftz\" (UID: \"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679583 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be5b19a8-200f-462e-b8f2-fc956ec52080-metrics-certs\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679607 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-reloader\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679636 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-conf\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679652 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjjpw\" (UniqueName: \"kubernetes.io/projected/be5b19a8-200f-462e-b8f2-fc956ec52080-kube-api-access-tjjpw\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679672 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-startup\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.679698 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-sockets\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.680041 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-sockets\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.680222 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-reloader\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.680453 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-conf\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.680882 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/be5b19a8-200f-462e-b8f2-fc956ec52080-metrics\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: E0130 05:21:01.680904 4931 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Jan 30 05:21:01 crc kubenswrapper[4931]: E0130 05:21:01.680987 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-cert podName:3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e nodeName:}" failed. No retries permitted until 2026-01-30 05:21:02.180965525 +0000 UTC m=+797.550875782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-cert") pod "frr-k8s-webhook-server-7df86c4f6c-56ftz" (UID: "3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e") : secret "frr-k8s-webhook-server-cert" not found
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.681347 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/be5b19a8-200f-462e-b8f2-fc956ec52080-frr-startup\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.693057 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be5b19a8-200f-462e-b8f2-fc956ec52080-metrics-certs\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.695803 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjjpw\" (UniqueName: \"kubernetes.io/projected/be5b19a8-200f-462e-b8f2-fc956ec52080-kube-api-access-tjjpw\") pod \"frr-k8s-768qr\" (UID: \"be5b19a8-200f-462e-b8f2-fc956ec52080\") " pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.696501 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghnp\" (UniqueName: \"kubernetes.io/projected/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-kube-api-access-pghnp\") pod \"frr-k8s-webhook-server-7df86c4f6c-56ftz\" (UID: \"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.781324 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-metrics-certs\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.781387 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8pp\" (UniqueName: \"kubernetes.io/projected/c9c06e8c-f207-490b-8bea-d6a742d63e72-kube-api-access-2z8pp\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.781434 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-cert\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.781513 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-metallb-excludel2\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.781621 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh7xf\" (UniqueName: \"kubernetes.io/projected/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-kube-api-access-kh7xf\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.781681 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-metrics-certs\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.781791 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.882551 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.882799 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.882862 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-metrics-certs\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.882892 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8pp\" (UniqueName: \"kubernetes.io/projected/c9c06e8c-f207-490b-8bea-d6a742d63e72-kube-api-access-2z8pp\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.882914 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-cert\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.882941 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-metallb-excludel2\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.882976 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh7xf\" (UniqueName: \"kubernetes.io/projected/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-kube-api-access-kh7xf\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.883000 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-metrics-certs\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: E0130 05:21:01.883130 4931 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Jan 30 05:21:01 crc kubenswrapper[4931]: E0130 05:21:01.883175 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-metrics-certs podName:c9c06e8c-f207-490b-8bea-d6a742d63e72 nodeName:}" failed. No retries permitted until 2026-01-30 05:21:02.383160933 +0000 UTC m=+797.753071180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-metrics-certs") pod "controller-6968d8fdc4-g5mxs" (UID: "c9c06e8c-f207-490b-8bea-d6a742d63e72") : secret "controller-certs-secret" not found
Jan 30 05:21:01 crc kubenswrapper[4931]: E0130 05:21:01.883397 4931 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 30 05:21:01 crc kubenswrapper[4931]: E0130 05:21:01.883490 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist podName:f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18 nodeName:}" failed. No retries permitted until 2026-01-30 05:21:02.383480452 +0000 UTC m=+797.753390709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist") pod "speaker-rcpl2" (UID: "f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18") : secret "metallb-memberlist" not found
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.884114 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-metallb-excludel2\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.886938 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.887410 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-metrics-certs\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.898024 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-cert\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.907139 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh7xf\" (UniqueName: \"kubernetes.io/projected/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-kube-api-access-kh7xf\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:01 crc kubenswrapper[4931]: I0130 05:21:01.908297 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8pp\" (UniqueName: \"kubernetes.io/projected/c9c06e8c-f207-490b-8bea-d6a742d63e72-kube-api-access-2z8pp\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.186413 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-56ftz\" (UID: \"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.190745 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-56ftz\" (UID: \"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.388637 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-metrics-certs\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.388754 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:02 crc kubenswrapper[4931]: E0130 05:21:02.388953 4931 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 30 05:21:02 crc kubenswrapper[4931]: E0130 05:21:02.389036 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist podName:f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18 nodeName:}" failed. No retries permitted until 2026-01-30 05:21:03.389013613 +0000 UTC m=+798.758923910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist") pod "speaker-rcpl2" (UID: "f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18") : secret "metallb-memberlist" not found
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.394983 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9c06e8c-f207-490b-8bea-d6a742d63e72-metrics-certs\") pod \"controller-6968d8fdc4-g5mxs\" (UID: \"c9c06e8c-f207-490b-8bea-d6a742d63e72\") " pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.461179 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.477547 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerStarted","Data":"7b50a49fc816ff5341dd997175c4c6b3b9d39b62e08397148b0ba2c206d3b30d"}
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.540980 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.786336 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz"]
Jan 30 05:21:02 crc kubenswrapper[4931]: I0130 05:21:02.891199 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-g5mxs"]
Jan 30 05:21:02 crc kubenswrapper[4931]: W0130 05:21:02.899755 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c06e8c_f207_490b_8bea_d6a742d63e72.slice/crio-8c513c5b17f52d69a08cfd284411679c3c0adca1c9e5059f5ef107c77731997b WatchSource:0}: Error finding container 8c513c5b17f52d69a08cfd284411679c3c0adca1c9e5059f5ef107c77731997b: Status 404 returned error can't find the container with id 8c513c5b17f52d69a08cfd284411679c3c0adca1c9e5059f5ef107c77731997b
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.401283 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.409496 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18-memberlist\") pod \"speaker-rcpl2\" (UID: \"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18\") " pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.425822 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rcpl2"
Jan 30 05:21:03 crc kubenswrapper[4931]: W0130 05:21:03.451608 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9e7a8e0_b04b_4d38_a05c_a9baa66d2c18.slice/crio-36fbaf6857ed3b18394339b7a673af544f2ae62e2c1ff510538ec9764c90599d WatchSource:0}: Error finding container 36fbaf6857ed3b18394339b7a673af544f2ae62e2c1ff510538ec9764c90599d: Status 404 returned error can't find the container with id 36fbaf6857ed3b18394339b7a673af544f2ae62e2c1ff510538ec9764c90599d
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.484934 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-g5mxs" event={"ID":"c9c06e8c-f207-490b-8bea-d6a742d63e72","Type":"ContainerStarted","Data":"b99b6e018c11c527be0e94c3c6bfc05b635e157d728c7cc74bdbaad9fdefca78"}
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.485263 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-g5mxs" event={"ID":"c9c06e8c-f207-490b-8bea-d6a742d63e72","Type":"ContainerStarted","Data":"3d0ef34e375de9ece415a82c35363c06734bf76d9f9e31976e861ace41f4a2c1"}
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.485277 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-g5mxs" event={"ID":"c9c06e8c-f207-490b-8bea-d6a742d63e72","Type":"ContainerStarted","Data":"8c513c5b17f52d69a08cfd284411679c3c0adca1c9e5059f5ef107c77731997b"}
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.485586 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-g5mxs"
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.487337 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rcpl2" event={"ID":"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18","Type":"ContainerStarted","Data":"36fbaf6857ed3b18394339b7a673af544f2ae62e2c1ff510538ec9764c90599d"}
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.488772 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz" event={"ID":"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e","Type":"ContainerStarted","Data":"7446ca5d1144ddde508fbf779bbefc6130d717b9170d677b9287d5111c1e8d0a"}
Jan 30 05:21:03 crc kubenswrapper[4931]: I0130 05:21:03.504010 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-g5mxs" podStartSLOduration=2.503991567 podStartE2EDuration="2.503991567s" podCreationTimestamp="2026-01-30 05:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:21:03.501146407 +0000 UTC m=+798.871056684" watchObservedRunningTime="2026-01-30 05:21:03.503991567 +0000 UTC m=+798.873901824"
Jan 30 05:21:04 crc kubenswrapper[4931]: I0130 05:21:04.495586 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rcpl2" event={"ID":"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18","Type":"ContainerStarted","Data":"47e8533c2f9476d932560ca0e77c3e05a995ac8dad3ca7531ed736451c7d8bb1"}
Jan 30 05:21:04 crc kubenswrapper[4931]: I0130 05:21:04.495648 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rcpl2" event={"ID":"f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18","Type":"ContainerStarted","Data":"7e32cf4f79c08d2dcce18e81afb226dda165b036f93414277290c7c605fed69a"}
Jan 30 05:21:04 crc kubenswrapper[4931]: I0130 05:21:04.510058 4931 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="metallb-system/speaker-rcpl2" podStartSLOduration=3.510042287 podStartE2EDuration="3.510042287s" podCreationTimestamp="2026-01-30 05:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:21:04.508782622 +0000 UTC m=+799.878692879" watchObservedRunningTime="2026-01-30 05:21:04.510042287 +0000 UTC m=+799.879952544" Jan 30 05:21:05 crc kubenswrapper[4931]: I0130 05:21:05.500899 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rcpl2" Jan 30 05:21:10 crc kubenswrapper[4931]: I0130 05:21:10.548897 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz" event={"ID":"3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e","Type":"ContainerStarted","Data":"4c5800e75db9818badd02d28f4f4cdd69afd271f0b4ebdfd87b9ad81321a331e"} Jan 30 05:21:10 crc kubenswrapper[4931]: I0130 05:21:10.549877 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz" Jan 30 05:21:10 crc kubenswrapper[4931]: I0130 05:21:10.551276 4931 generic.go:334] "Generic (PLEG): container finished" podID="be5b19a8-200f-462e-b8f2-fc956ec52080" containerID="d091fdb203c9eebcff39560e61d24099acb2730684803a6aad64d3125fbc8900" exitCode=0 Jan 30 05:21:10 crc kubenswrapper[4931]: I0130 05:21:10.551302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerDied","Data":"d091fdb203c9eebcff39560e61d24099acb2730684803a6aad64d3125fbc8900"} Jan 30 05:21:10 crc kubenswrapper[4931]: I0130 05:21:10.569159 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz" podStartSLOduration=2.911992925 podStartE2EDuration="9.569143459s" podCreationTimestamp="2026-01-30 05:21:01 +0000 UTC" 
firstStartedPulling="2026-01-30 05:21:02.821581821 +0000 UTC m=+798.191492088" lastFinishedPulling="2026-01-30 05:21:09.478732325 +0000 UTC m=+804.848642622" observedRunningTime="2026-01-30 05:21:10.567744909 +0000 UTC m=+805.937655166" watchObservedRunningTime="2026-01-30 05:21:10.569143459 +0000 UTC m=+805.939053716" Jan 30 05:21:11 crc kubenswrapper[4931]: I0130 05:21:11.561841 4931 generic.go:334] "Generic (PLEG): container finished" podID="be5b19a8-200f-462e-b8f2-fc956ec52080" containerID="3614b404af2e15a8c92c762f8971d4adf0695ec4c2675a1f4ba165839d330054" exitCode=0 Jan 30 05:21:11 crc kubenswrapper[4931]: I0130 05:21:11.561892 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerDied","Data":"3614b404af2e15a8c92c762f8971d4adf0695ec4c2675a1f4ba165839d330054"} Jan 30 05:21:12 crc kubenswrapper[4931]: I0130 05:21:12.575940 4931 generic.go:334] "Generic (PLEG): container finished" podID="be5b19a8-200f-462e-b8f2-fc956ec52080" containerID="832b72323076f1a2cdb42a6edc7ca402e3fa358a97ca8e86a49bc1733370824f" exitCode=0 Jan 30 05:21:12 crc kubenswrapper[4931]: I0130 05:21:12.576732 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerDied","Data":"832b72323076f1a2cdb42a6edc7ca402e3fa358a97ca8e86a49bc1733370824f"} Jan 30 05:21:13 crc kubenswrapper[4931]: I0130 05:21:13.430654 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rcpl2" Jan 30 05:21:13 crc kubenswrapper[4931]: I0130 05:21:13.594990 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerStarted","Data":"d345466d2acfb0fca0d70ee06649c0cf8629f7cfda0ea80fc958b94cba914897"} Jan 30 05:21:13 crc kubenswrapper[4931]: I0130 05:21:13.595056 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerStarted","Data":"689aab8a6c9e67bedae8f0fe7c6d3439fbd3d2a70509ca012f325407586e71a3"} Jan 30 05:21:13 crc kubenswrapper[4931]: I0130 05:21:13.595071 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerStarted","Data":"1d4ffcf3b8ce542596cc1a545cdcc8d9a6ba151afb2f6cd25406b2283f55b5a5"} Jan 30 05:21:13 crc kubenswrapper[4931]: I0130 05:21:13.595085 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerStarted","Data":"5ee6174343317c45a29a8db376af997e091e54bbbe262524d77e9604e18bef7c"} Jan 30 05:21:13 crc kubenswrapper[4931]: I0130 05:21:13.595109 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerStarted","Data":"5923ba459c547a350ff8b7ff30bcebed88229ac1c61a5c117fbb21f27cd8e21a"} Jan 30 05:21:14 crc kubenswrapper[4931]: I0130 05:21:14.607152 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-768qr" event={"ID":"be5b19a8-200f-462e-b8f2-fc956ec52080","Type":"ContainerStarted","Data":"0c5d99e64f4e55b25f067af817de7a33d4fdf66d6bf5189b387f92886daff63d"} Jan 30 05:21:14 crc kubenswrapper[4931]: I0130 05:21:14.608207 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-768qr" Jan 30 05:21:14 crc kubenswrapper[4931]: I0130 05:21:14.640633 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-768qr" podStartSLOduration=6.211616472 podStartE2EDuration="13.640615639s" podCreationTimestamp="2026-01-30 05:21:01 +0000 UTC" firstStartedPulling="2026-01-30 05:21:02.047634029 +0000 UTC m=+797.417544286" 
lastFinishedPulling="2026-01-30 05:21:09.476633186 +0000 UTC m=+804.846543453" observedRunningTime="2026-01-30 05:21:14.639550739 +0000 UTC m=+810.009461006" watchObservedRunningTime="2026-01-30 05:21:14.640615639 +0000 UTC m=+810.010525906" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.220154 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd"] Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.222559 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.225904 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.230184 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd"] Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.408086 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46nt2\" (UniqueName: \"kubernetes.io/projected/686d3bad-998e-4688-a556-c25a0770810a-kube-api-access-46nt2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.408237 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.408271 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.509121 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.509178 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.509232 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46nt2\" (UniqueName: \"kubernetes.io/projected/686d3bad-998e-4688-a556-c25a0770810a-kube-api-access-46nt2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 
05:21:15.509698 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.509971 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.543202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46nt2\" (UniqueName: \"kubernetes.io/projected/686d3bad-998e-4688-a556-c25a0770810a-kube-api-access-46nt2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:15 crc kubenswrapper[4931]: I0130 05:21:15.576280 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:16 crc kubenswrapper[4931]: I0130 05:21:16.061742 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd"] Jan 30 05:21:16 crc kubenswrapper[4931]: I0130 05:21:16.625980 4931 generic.go:334] "Generic (PLEG): container finished" podID="686d3bad-998e-4688-a556-c25a0770810a" containerID="dbeb18acfd8714226ddd9a2e3758eae93dbab1d2bafd3b44dbac6fcea4d2cc71" exitCode=0 Jan 30 05:21:16 crc kubenswrapper[4931]: I0130 05:21:16.626067 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" event={"ID":"686d3bad-998e-4688-a556-c25a0770810a","Type":"ContainerDied","Data":"dbeb18acfd8714226ddd9a2e3758eae93dbab1d2bafd3b44dbac6fcea4d2cc71"} Jan 30 05:21:16 crc kubenswrapper[4931]: I0130 05:21:16.626733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" event={"ID":"686d3bad-998e-4688-a556-c25a0770810a","Type":"ContainerStarted","Data":"a8f056d371a59e738cc49b1654c31fbae3f10712ac02381d1d03ca264298db58"} Jan 30 05:21:16 crc kubenswrapper[4931]: I0130 05:21:16.882840 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-768qr" Jan 30 05:21:16 crc kubenswrapper[4931]: I0130 05:21:16.958909 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-768qr" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.769730 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzlbl"] Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.772020 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.778572 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzlbl"] Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.864764 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-utilities\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.864845 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-catalog-content\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.864881 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkgz\" (UniqueName: \"kubernetes.io/projected/15d065d1-b45c-450a-9eec-aa929632433c-kube-api-access-jkkgz\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.965340 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkgz\" (UniqueName: \"kubernetes.io/projected/15d065d1-b45c-450a-9eec-aa929632433c-kube-api-access-jkkgz\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.965471 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-utilities\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.965514 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-catalog-content\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.966068 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-catalog-content\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.966150 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-utilities\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:18 crc kubenswrapper[4931]: I0130 05:21:18.988578 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkgz\" (UniqueName: \"kubernetes.io/projected/15d065d1-b45c-450a-9eec-aa929632433c-kube-api-access-jkkgz\") pod \"redhat-operators-dzlbl\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") " pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:19 crc kubenswrapper[4931]: I0130 05:21:19.149524 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzlbl" Jan 30 05:21:20 crc kubenswrapper[4931]: I0130 05:21:20.653746 4931 generic.go:334] "Generic (PLEG): container finished" podID="686d3bad-998e-4688-a556-c25a0770810a" containerID="8ad6359b697f32b69166cbb4abeaa98ee59ced3e0a39f5170c0a74d04deeeb04" exitCode=0 Jan 30 05:21:20 crc kubenswrapper[4931]: I0130 05:21:20.653878 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" event={"ID":"686d3bad-998e-4688-a556-c25a0770810a","Type":"ContainerDied","Data":"8ad6359b697f32b69166cbb4abeaa98ee59ced3e0a39f5170c0a74d04deeeb04"} Jan 30 05:21:20 crc kubenswrapper[4931]: I0130 05:21:20.714457 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzlbl"] Jan 30 05:21:20 crc kubenswrapper[4931]: W0130 05:21:20.724658 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d065d1_b45c_450a_9eec_aa929632433c.slice/crio-2d693934f6449d82237d1c208f5b4b2834aad8a3fb2653ce87c489e64b971cbc WatchSource:0}: Error finding container 2d693934f6449d82237d1c208f5b4b2834aad8a3fb2653ce87c489e64b971cbc: Status 404 returned error can't find the container with id 2d693934f6449d82237d1c208f5b4b2834aad8a3fb2653ce87c489e64b971cbc Jan 30 05:21:21 crc kubenswrapper[4931]: I0130 05:21:21.664038 4931 generic.go:334] "Generic (PLEG): container finished" podID="15d065d1-b45c-450a-9eec-aa929632433c" containerID="84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486" exitCode=0 Jan 30 05:21:21 crc kubenswrapper[4931]: I0130 05:21:21.664093 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzlbl" event={"ID":"15d065d1-b45c-450a-9eec-aa929632433c","Type":"ContainerDied","Data":"84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486"} Jan 30 05:21:21 crc 
kubenswrapper[4931]: I0130 05:21:21.664486 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzlbl" event={"ID":"15d065d1-b45c-450a-9eec-aa929632433c","Type":"ContainerStarted","Data":"2d693934f6449d82237d1c208f5b4b2834aad8a3fb2653ce87c489e64b971cbc"} Jan 30 05:21:21 crc kubenswrapper[4931]: I0130 05:21:21.673416 4931 generic.go:334] "Generic (PLEG): container finished" podID="686d3bad-998e-4688-a556-c25a0770810a" containerID="38d6d2e1ea8c080852c1379421d5471393a5c209a8bd5c148fd5da9727525c88" exitCode=0 Jan 30 05:21:21 crc kubenswrapper[4931]: I0130 05:21:21.673495 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" event={"ID":"686d3bad-998e-4688-a556-c25a0770810a","Type":"ContainerDied","Data":"38d6d2e1ea8c080852c1379421d5471393a5c209a8bd5c148fd5da9727525c88"} Jan 30 05:21:22 crc kubenswrapper[4931]: I0130 05:21:22.470415 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-56ftz" Jan 30 05:21:22 crc kubenswrapper[4931]: I0130 05:21:22.546727 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-g5mxs" Jan 30 05:21:22 crc kubenswrapper[4931]: I0130 05:21:22.680482 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzlbl" event={"ID":"15d065d1-b45c-450a-9eec-aa929632433c","Type":"ContainerStarted","Data":"1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269"} Jan 30 05:21:22 crc kubenswrapper[4931]: I0130 05:21:22.965970 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.018414 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-bundle\") pod \"686d3bad-998e-4688-a556-c25a0770810a\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.018494 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46nt2\" (UniqueName: \"kubernetes.io/projected/686d3bad-998e-4688-a556-c25a0770810a-kube-api-access-46nt2\") pod \"686d3bad-998e-4688-a556-c25a0770810a\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.018539 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-util\") pod \"686d3bad-998e-4688-a556-c25a0770810a\" (UID: \"686d3bad-998e-4688-a556-c25a0770810a\") " Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.019195 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-bundle" (OuterVolumeSpecName: "bundle") pod "686d3bad-998e-4688-a556-c25a0770810a" (UID: "686d3bad-998e-4688-a556-c25a0770810a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.025699 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686d3bad-998e-4688-a556-c25a0770810a-kube-api-access-46nt2" (OuterVolumeSpecName: "kube-api-access-46nt2") pod "686d3bad-998e-4688-a556-c25a0770810a" (UID: "686d3bad-998e-4688-a556-c25a0770810a"). InnerVolumeSpecName "kube-api-access-46nt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.031318 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-util" (OuterVolumeSpecName: "util") pod "686d3bad-998e-4688-a556-c25a0770810a" (UID: "686d3bad-998e-4688-a556-c25a0770810a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.119787 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.119816 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46nt2\" (UniqueName: \"kubernetes.io/projected/686d3bad-998e-4688-a556-c25a0770810a-kube-api-access-46nt2\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.119826 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/686d3bad-998e-4688-a556-c25a0770810a-util\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.693669 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd"
Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.693690 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd" event={"ID":"686d3bad-998e-4688-a556-c25a0770810a","Type":"ContainerDied","Data":"a8f056d371a59e738cc49b1654c31fbae3f10712ac02381d1d03ca264298db58"}
Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.693763 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f056d371a59e738cc49b1654c31fbae3f10712ac02381d1d03ca264298db58"
Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.697040 4931 generic.go:334] "Generic (PLEG): container finished" podID="15d065d1-b45c-450a-9eec-aa929632433c" containerID="1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269" exitCode=0
Jan 30 05:21:23 crc kubenswrapper[4931]: I0130 05:21:23.697108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzlbl" event={"ID":"15d065d1-b45c-450a-9eec-aa929632433c","Type":"ContainerDied","Data":"1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269"}
Jan 30 05:21:25 crc kubenswrapper[4931]: I0130 05:21:25.712279 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzlbl" event={"ID":"15d065d1-b45c-450a-9eec-aa929632433c","Type":"ContainerStarted","Data":"ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2"}
Jan 30 05:21:25 crc kubenswrapper[4931]: I0130 05:21:25.730002 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzlbl" podStartSLOduration=4.571202325 podStartE2EDuration="7.729988221s" podCreationTimestamp="2026-01-30 05:21:18 +0000 UTC" firstStartedPulling="2026-01-30 05:21:21.666889947 +0000 UTC m=+817.036800214" lastFinishedPulling="2026-01-30 05:21:24.825675823 +0000 UTC m=+820.195586110" observedRunningTime="2026-01-30 05:21:25.726413691 +0000 UTC m=+821.096323958" watchObservedRunningTime="2026-01-30 05:21:25.729988221 +0000 UTC m=+821.099898478"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.171987 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"]
Jan 30 05:21:28 crc kubenswrapper[4931]: E0130 05:21:28.172817 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686d3bad-998e-4688-a556-c25a0770810a" containerName="extract"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.172832 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="686d3bad-998e-4688-a556-c25a0770810a" containerName="extract"
Jan 30 05:21:28 crc kubenswrapper[4931]: E0130 05:21:28.172854 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686d3bad-998e-4688-a556-c25a0770810a" containerName="util"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.172861 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="686d3bad-998e-4688-a556-c25a0770810a" containerName="util"
Jan 30 05:21:28 crc kubenswrapper[4931]: E0130 05:21:28.172870 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686d3bad-998e-4688-a556-c25a0770810a" containerName="pull"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.172879 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="686d3bad-998e-4688-a556-c25a0770810a" containerName="pull"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.173025 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="686d3bad-998e-4688-a556-c25a0770810a" containerName="extract"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.173490 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.175048 4931 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-lg9z8"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.175331 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.175638 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.190574 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"]
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.200307 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b67ac51-9e69-4b48-a6b3-2252a8c635ae-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9bbzd\" (UID: \"9b67ac51-9e69-4b48-a6b3-2252a8c635ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.200370 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pgpr\" (UniqueName: \"kubernetes.io/projected/9b67ac51-9e69-4b48-a6b3-2252a8c635ae-kube-api-access-4pgpr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9bbzd\" (UID: \"9b67ac51-9e69-4b48-a6b3-2252a8c635ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.302008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b67ac51-9e69-4b48-a6b3-2252a8c635ae-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9bbzd\" (UID: \"9b67ac51-9e69-4b48-a6b3-2252a8c635ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.302077 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pgpr\" (UniqueName: \"kubernetes.io/projected/9b67ac51-9e69-4b48-a6b3-2252a8c635ae-kube-api-access-4pgpr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9bbzd\" (UID: \"9b67ac51-9e69-4b48-a6b3-2252a8c635ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.302496 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b67ac51-9e69-4b48-a6b3-2252a8c635ae-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9bbzd\" (UID: \"9b67ac51-9e69-4b48-a6b3-2252a8c635ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.330991 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pgpr\" (UniqueName: \"kubernetes.io/projected/9b67ac51-9e69-4b48-a6b3-2252a8c635ae-kube-api-access-4pgpr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9bbzd\" (UID: \"9b67ac51-9e69-4b48-a6b3-2252a8c635ae\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.539570 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"
Jan 30 05:21:28 crc kubenswrapper[4931]: I0130 05:21:28.952891 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd"]
Jan 30 05:21:28 crc kubenswrapper[4931]: W0130 05:21:28.966452 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b67ac51_9e69_4b48_a6b3_2252a8c635ae.slice/crio-01273766cf6a03889bff8cd66278caf13630c4b081a9550ce6ddacfebab92d0e WatchSource:0}: Error finding container 01273766cf6a03889bff8cd66278caf13630c4b081a9550ce6ddacfebab92d0e: Status 404 returned error can't find the container with id 01273766cf6a03889bff8cd66278caf13630c4b081a9550ce6ddacfebab92d0e
Jan 30 05:21:29 crc kubenswrapper[4931]: I0130 05:21:29.149909 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzlbl"
Jan 30 05:21:29 crc kubenswrapper[4931]: I0130 05:21:29.150047 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzlbl"
Jan 30 05:21:29 crc kubenswrapper[4931]: I0130 05:21:29.741086 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd" event={"ID":"9b67ac51-9e69-4b48-a6b3-2252a8c635ae","Type":"ContainerStarted","Data":"01273766cf6a03889bff8cd66278caf13630c4b081a9550ce6ddacfebab92d0e"}
Jan 30 05:21:30 crc kubenswrapper[4931]: I0130 05:21:30.229123 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzlbl" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:21:30 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:21:30 crc kubenswrapper[4931]: >
Jan 30 05:21:31 crc kubenswrapper[4931]: I0130 05:21:31.897546 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-768qr"
Jan 30 05:21:32 crc kubenswrapper[4931]: I0130 05:21:32.764035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd" event={"ID":"9b67ac51-9e69-4b48-a6b3-2252a8c635ae","Type":"ContainerStarted","Data":"4adc7f48a4b039047124a737e2ee98b55324b94ed05a4187b66008366f4ec0c9"}
Jan 30 05:21:32 crc kubenswrapper[4931]: I0130 05:21:32.799151 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9bbzd" podStartSLOduration=1.6231087039999998 podStartE2EDuration="4.799125965s" podCreationTimestamp="2026-01-30 05:21:28 +0000 UTC" firstStartedPulling="2026-01-30 05:21:28.969383535 +0000 UTC m=+824.339293802" lastFinishedPulling="2026-01-30 05:21:32.145400796 +0000 UTC m=+827.515311063" observedRunningTime="2026-01-30 05:21:32.794141565 +0000 UTC m=+828.164051852" watchObservedRunningTime="2026-01-30 05:21:32.799125965 +0000 UTC m=+828.169036262"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.367255 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-hsrfm"]
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.368765 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.372207 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.372362 4931 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qlvrg"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.372417 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.388862 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-hsrfm"]
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.503743 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5049b2a6-f85e-4250-9b12-c70705adaf35-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-hsrfm\" (UID: \"5049b2a6-f85e-4250-9b12-c70705adaf35\") " pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.503888 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67w6m\" (UniqueName: \"kubernetes.io/projected/5049b2a6-f85e-4250-9b12-c70705adaf35-kube-api-access-67w6m\") pod \"cert-manager-webhook-6888856db4-hsrfm\" (UID: \"5049b2a6-f85e-4250-9b12-c70705adaf35\") " pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.605396 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67w6m\" (UniqueName: \"kubernetes.io/projected/5049b2a6-f85e-4250-9b12-c70705adaf35-kube-api-access-67w6m\") pod \"cert-manager-webhook-6888856db4-hsrfm\" (UID: \"5049b2a6-f85e-4250-9b12-c70705adaf35\") " pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.605666 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5049b2a6-f85e-4250-9b12-c70705adaf35-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-hsrfm\" (UID: \"5049b2a6-f85e-4250-9b12-c70705adaf35\") " pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.640903 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5049b2a6-f85e-4250-9b12-c70705adaf35-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-hsrfm\" (UID: \"5049b2a6-f85e-4250-9b12-c70705adaf35\") " pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.640968 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67w6m\" (UniqueName: \"kubernetes.io/projected/5049b2a6-f85e-4250-9b12-c70705adaf35-kube-api-access-67w6m\") pod \"cert-manager-webhook-6888856db4-hsrfm\" (UID: \"5049b2a6-f85e-4250-9b12-c70705adaf35\") " pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:36 crc kubenswrapper[4931]: I0130 05:21:36.698693 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:37 crc kubenswrapper[4931]: I0130 05:21:37.021997 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-hsrfm"]
Jan 30 05:21:37 crc kubenswrapper[4931]: I0130 05:21:37.805811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm" event={"ID":"5049b2a6-f85e-4250-9b12-c70705adaf35","Type":"ContainerStarted","Data":"d4364c06c83c8bc95959ce092c6feddcbb0bf46ffc65930c8af19d5e9b61ab34"}
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.207538 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzlbl"
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.256016 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzlbl"
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.451452 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzlbl"]
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.872851 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-qvz8h"]
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.873582 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.876857 4931 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-p2d92"
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.889507 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-qvz8h"]
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.952794 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39da06e0-e9ea-4570-b486-3c0d2fe79820-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-qvz8h\" (UID: \"39da06e0-e9ea-4570-b486-3c0d2fe79820\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:39 crc kubenswrapper[4931]: I0130 05:21:39.952916 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfwg\" (UniqueName: \"kubernetes.io/projected/39da06e0-e9ea-4570-b486-3c0d2fe79820-kube-api-access-pbfwg\") pod \"cert-manager-cainjector-5545bd876-qvz8h\" (UID: \"39da06e0-e9ea-4570-b486-3c0d2fe79820\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:40 crc kubenswrapper[4931]: I0130 05:21:40.053988 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39da06e0-e9ea-4570-b486-3c0d2fe79820-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-qvz8h\" (UID: \"39da06e0-e9ea-4570-b486-3c0d2fe79820\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:40 crc kubenswrapper[4931]: I0130 05:21:40.054071 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbfwg\" (UniqueName: \"kubernetes.io/projected/39da06e0-e9ea-4570-b486-3c0d2fe79820-kube-api-access-pbfwg\") pod \"cert-manager-cainjector-5545bd876-qvz8h\" (UID: \"39da06e0-e9ea-4570-b486-3c0d2fe79820\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:40 crc kubenswrapper[4931]: I0130 05:21:40.078001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39da06e0-e9ea-4570-b486-3c0d2fe79820-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-qvz8h\" (UID: \"39da06e0-e9ea-4570-b486-3c0d2fe79820\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:40 crc kubenswrapper[4931]: I0130 05:21:40.081628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfwg\" (UniqueName: \"kubernetes.io/projected/39da06e0-e9ea-4570-b486-3c0d2fe79820-kube-api-access-pbfwg\") pod \"cert-manager-cainjector-5545bd876-qvz8h\" (UID: \"39da06e0-e9ea-4570-b486-3c0d2fe79820\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:40 crc kubenswrapper[4931]: I0130 05:21:40.188766 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h"
Jan 30 05:21:40 crc kubenswrapper[4931]: I0130 05:21:40.823629 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzlbl" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="registry-server" containerID="cri-o://ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2" gracePeriod=2
Jan 30 05:21:41 crc kubenswrapper[4931]: I0130 05:21:41.560030 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzlbl"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.540695 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-utilities\") pod \"15d065d1-b45c-450a-9eec-aa929632433c\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") "
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.540747 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkkgz\" (UniqueName: \"kubernetes.io/projected/15d065d1-b45c-450a-9eec-aa929632433c-kube-api-access-jkkgz\") pod \"15d065d1-b45c-450a-9eec-aa929632433c\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") "
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.540816 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-catalog-content\") pod \"15d065d1-b45c-450a-9eec-aa929632433c\" (UID: \"15d065d1-b45c-450a-9eec-aa929632433c\") "
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.542791 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-utilities" (OuterVolumeSpecName: "utilities") pod "15d065d1-b45c-450a-9eec-aa929632433c" (UID: "15d065d1-b45c-450a-9eec-aa929632433c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.564074 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d065d1-b45c-450a-9eec-aa929632433c-kube-api-access-jkkgz" (OuterVolumeSpecName: "kube-api-access-jkkgz") pod "15d065d1-b45c-450a-9eec-aa929632433c" (UID: "15d065d1-b45c-450a-9eec-aa929632433c"). InnerVolumeSpecName "kube-api-access-jkkgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.608862 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.644602 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.644695 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkkgz\" (UniqueName: \"kubernetes.io/projected/15d065d1-b45c-450a-9eec-aa929632433c-kube-api-access-jkkgz\") on node \"crc\" DevicePath \"\""
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.651188 4931 generic.go:334] "Generic (PLEG): container finished" podID="15d065d1-b45c-450a-9eec-aa929632433c" containerID="ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2" exitCode=0
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.651241 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzlbl" event={"ID":"15d065d1-b45c-450a-9eec-aa929632433c","Type":"ContainerDied","Data":"ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2"}
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.651267 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzlbl" event={"ID":"15d065d1-b45c-450a-9eec-aa929632433c","Type":"ContainerDied","Data":"2d693934f6449d82237d1c208f5b4b2834aad8a3fb2653ce87c489e64b971cbc"}
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.651321 4931 scope.go:117] "RemoveContainer" containerID="ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.651517 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzlbl"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.682934 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm" podStartSLOduration=2.3317749770000002 podStartE2EDuration="6.682912574s" podCreationTimestamp="2026-01-30 05:21:36 +0000 UTC" firstStartedPulling="2026-01-30 05:21:37.026880701 +0000 UTC m=+832.396790958" lastFinishedPulling="2026-01-30 05:21:41.378018298 +0000 UTC m=+836.747928555" observedRunningTime="2026-01-30 05:21:42.665242627 +0000 UTC m=+838.035152884" watchObservedRunningTime="2026-01-30 05:21:42.682912574 +0000 UTC m=+838.052822831"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.702738 4931 scope.go:117] "RemoveContainer" containerID="1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269"
Jan 30 05:21:42 crc kubenswrapper[4931]: W0130 05:21:42.722561 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39da06e0_e9ea_4570_b486_3c0d2fe79820.slice/crio-a1b1c4f3db08a3fca137107295c23f533b18590d886a51a91a92a351bf5f8b8a WatchSource:0}: Error finding container a1b1c4f3db08a3fca137107295c23f533b18590d886a51a91a92a351bf5f8b8a: Status 404 returned error can't find the container with id a1b1c4f3db08a3fca137107295c23f533b18590d886a51a91a92a351bf5f8b8a
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.748024 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-qvz8h"]
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.759525 4931 scope.go:117] "RemoveContainer" containerID="84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.803634 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15d065d1-b45c-450a-9eec-aa929632433c" (UID: "15d065d1-b45c-450a-9eec-aa929632433c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.807327 4931 scope.go:117] "RemoveContainer" containerID="ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2"
Jan 30 05:21:42 crc kubenswrapper[4931]: E0130 05:21:42.807729 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2\": container with ID starting with ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2 not found: ID does not exist" containerID="ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.807783 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2"} err="failed to get container status \"ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2\": rpc error: code = NotFound desc = could not find container \"ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2\": container with ID starting with ed1f535365a23ff00632912b9812f20b552c9fe41ab911db1619fcd42b6463c2 not found: ID does not exist"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.807805 4931 scope.go:117] "RemoveContainer" containerID="1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269"
Jan 30 05:21:42 crc kubenswrapper[4931]: E0130 05:21:42.808878 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269\": container with ID starting with 1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269 not found: ID does not exist" containerID="1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.808902 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269"} err="failed to get container status \"1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269\": rpc error: code = NotFound desc = could not find container \"1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269\": container with ID starting with 1e3b51c147db5be8431faa0661c6bbd4938b1e46b691a9473b0ddefd6ab8f269 not found: ID does not exist"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.808935 4931 scope.go:117] "RemoveContainer" containerID="84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486"
Jan 30 05:21:42 crc kubenswrapper[4931]: E0130 05:21:42.812097 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486\": container with ID starting with 84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486 not found: ID does not exist" containerID="84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.812139 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486"} err="failed to get container status \"84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486\": rpc error: code = NotFound desc = could not find container \"84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486\": container with ID starting with 84c7e5f808fcf810d715e22f2d08905c6c16fce44c23bc72deb01ddb3d056486 not found: ID does not exist"
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.848482 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d065d1-b45c-450a-9eec-aa929632433c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.975918 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzlbl"]
Jan 30 05:21:42 crc kubenswrapper[4931]: I0130 05:21:42.981293 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzlbl"]
Jan 30 05:21:43 crc kubenswrapper[4931]: I0130 05:21:43.435982 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d065d1-b45c-450a-9eec-aa929632433c" path="/var/lib/kubelet/pods/15d065d1-b45c-450a-9eec-aa929632433c/volumes"
Jan 30 05:21:43 crc kubenswrapper[4931]: I0130 05:21:43.660145 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h" event={"ID":"39da06e0-e9ea-4570-b486-3c0d2fe79820","Type":"ContainerStarted","Data":"73b088c3d892eac744ad92115e819cd8cee8d8079242a5dca67e10016dc0ed8c"}
Jan 30 05:21:43 crc kubenswrapper[4931]: I0130 05:21:43.660210 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h" event={"ID":"39da06e0-e9ea-4570-b486-3c0d2fe79820","Type":"ContainerStarted","Data":"a1b1c4f3db08a3fca137107295c23f533b18590d886a51a91a92a351bf5f8b8a"}
Jan 30 05:21:43 crc kubenswrapper[4931]: I0130 05:21:43.662365 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm" event={"ID":"5049b2a6-f85e-4250-9b12-c70705adaf35","Type":"ContainerStarted","Data":"8c63d0bc957465018542479545f43be67531bf71e4edd2b213c708243a140698"}
Jan 30 05:21:43 crc kubenswrapper[4931]: I0130 05:21:43.690095 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-qvz8h" podStartSLOduration=4.690059276 podStartE2EDuration="4.690059276s" podCreationTimestamp="2026-01-30 05:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:21:43.678548542 +0000 UTC m=+839.048458839" watchObservedRunningTime="2026-01-30 05:21:43.690059276 +0000 UTC m=+839.059969573"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.734077 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-8l2w4"]
Jan 30 05:21:47 crc kubenswrapper[4931]: E0130 05:21:47.735718 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="extract-content"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.735820 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="extract-content"
Jan 30 05:21:47 crc kubenswrapper[4931]: E0130 05:21:47.735909 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="extract-utilities"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.735988 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="extract-utilities"
Jan 30 05:21:47 crc kubenswrapper[4931]: E0130 05:21:47.736075 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="registry-server"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.736146 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="registry-server"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.736347 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d065d1-b45c-450a-9eec-aa929632433c" containerName="registry-server"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.736905 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.740549 4931 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2wljs"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.741292 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8l2w4"]
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.920315 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34b0cb15-9c48-4bb3-89e7-85efd5b8b76c-bound-sa-token\") pod \"cert-manager-545d4d4674-8l2w4\" (UID: \"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c\") " pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:47 crc kubenswrapper[4931]: I0130 05:21:47.920804 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwgct\" (UniqueName: \"kubernetes.io/projected/34b0cb15-9c48-4bb3-89e7-85efd5b8b76c-kube-api-access-dwgct\") pod \"cert-manager-545d4d4674-8l2w4\" (UID: \"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c\") " pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.022023 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwgct\" (UniqueName: \"kubernetes.io/projected/34b0cb15-9c48-4bb3-89e7-85efd5b8b76c-kube-api-access-dwgct\") pod \"cert-manager-545d4d4674-8l2w4\" (UID: \"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c\") " pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.022909 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34b0cb15-9c48-4bb3-89e7-85efd5b8b76c-bound-sa-token\") pod \"cert-manager-545d4d4674-8l2w4\" (UID: \"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c\") " pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.054554 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34b0cb15-9c48-4bb3-89e7-85efd5b8b76c-bound-sa-token\") pod \"cert-manager-545d4d4674-8l2w4\" (UID: \"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c\") " pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.054931 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwgct\" (UniqueName: \"kubernetes.io/projected/34b0cb15-9c48-4bb3-89e7-85efd5b8b76c-kube-api-access-dwgct\") pod \"cert-manager-545d4d4674-8l2w4\" (UID: \"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c\") " pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.064789 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8l2w4"
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.388856 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8l2w4"]
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.716393 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8l2w4" event={"ID":"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c","Type":"ContainerStarted","Data":"051fba3fd5334586372d6209b352cc8ea34ed84f5b2b4941f55de8ec5d2f3544"}
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.716821 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8l2w4" event={"ID":"34b0cb15-9c48-4bb3-89e7-85efd5b8b76c","Type":"ContainerStarted","Data":"596f227167a4e51e0a7d9a47a2c3cbbd895224dfdea08bf45fee58abe909417c"}
Jan 30 05:21:48 crc kubenswrapper[4931]: I0130 05:21:48.745456 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-8l2w4" podStartSLOduration=1.745405895 podStartE2EDuration="1.745405895s" podCreationTimestamp="2026-01-30 05:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:21:48.739131078 +0000 UTC m=+844.109041355" watchObservedRunningTime="2026-01-30 05:21:48.745405895 +0000 UTC m=+844.115316172"
Jan 30 05:21:51 crc kubenswrapper[4931]: I0130 05:21:51.702960 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-hsrfm"
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.217533 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kpdtt"]
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.219664 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kpdtt"
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.223675 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.234100 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.235070 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-29t8v"
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.237676 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrtl\" (UniqueName: \"kubernetes.io/projected/3fd521ad-cd31-4827-99cf-2d78ddcf12ab-kube-api-access-mlrtl\") pod \"openstack-operator-index-kpdtt\" (UID: \"3fd521ad-cd31-4827-99cf-2d78ddcf12ab\") " pod="openstack-operators/openstack-operator-index-kpdtt"
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.274595 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kpdtt"]
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.338613 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrtl\" (UniqueName: \"kubernetes.io/projected/3fd521ad-cd31-4827-99cf-2d78ddcf12ab-kube-api-access-mlrtl\") pod \"openstack-operator-index-kpdtt\" (UID: \"3fd521ad-cd31-4827-99cf-2d78ddcf12ab\") " pod="openstack-operators/openstack-operator-index-kpdtt"
Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.369801 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrtl\" (UniqueName: \"kubernetes.io/projected/3fd521ad-cd31-4827-99cf-2d78ddcf12ab-kube-api-access-mlrtl\") pod \"openstack-operator-index-kpdtt\" (UID:
\"3fd521ad-cd31-4827-99cf-2d78ddcf12ab\") " pod="openstack-operators/openstack-operator-index-kpdtt" Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.615178 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kpdtt" Jan 30 05:21:55 crc kubenswrapper[4931]: I0130 05:21:55.899604 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kpdtt"] Jan 30 05:21:55 crc kubenswrapper[4931]: W0130 05:21:55.909321 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd521ad_cd31_4827_99cf_2d78ddcf12ab.slice/crio-c8b5caf549c14da8f9183cfe6a9eab8cca937640c3b84c97dfebb701545714e1 WatchSource:0}: Error finding container c8b5caf549c14da8f9183cfe6a9eab8cca937640c3b84c97dfebb701545714e1: Status 404 returned error can't find the container with id c8b5caf549c14da8f9183cfe6a9eab8cca937640c3b84c97dfebb701545714e1 Jan 30 05:21:56 crc kubenswrapper[4931]: I0130 05:21:56.797128 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpdtt" event={"ID":"3fd521ad-cd31-4827-99cf-2d78ddcf12ab","Type":"ContainerStarted","Data":"c8b5caf549c14da8f9183cfe6a9eab8cca937640c3b84c97dfebb701545714e1"} Jan 30 05:21:57 crc kubenswrapper[4931]: I0130 05:21:57.804709 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpdtt" event={"ID":"3fd521ad-cd31-4827-99cf-2d78ddcf12ab","Type":"ContainerStarted","Data":"fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d"} Jan 30 05:21:57 crc kubenswrapper[4931]: I0130 05:21:57.823026 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kpdtt" podStartSLOduration=1.547232207 podStartE2EDuration="2.823012494s" podCreationTimestamp="2026-01-30 05:21:55 +0000 UTC" 
firstStartedPulling="2026-01-30 05:21:55.913308072 +0000 UTC m=+851.283218329" lastFinishedPulling="2026-01-30 05:21:57.189088349 +0000 UTC m=+852.558998616" observedRunningTime="2026-01-30 05:21:57.819659665 +0000 UTC m=+853.189569912" watchObservedRunningTime="2026-01-30 05:21:57.823012494 +0000 UTC m=+853.192922751" Jan 30 05:21:58 crc kubenswrapper[4931]: I0130 05:21:58.587402 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kpdtt"] Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.194791 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7znpc"] Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.195994 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.208258 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhm8\" (UniqueName: \"kubernetes.io/projected/2efa198c-4fe6-4ed2-9627-14a9ce525363-kube-api-access-hvhm8\") pod \"openstack-operator-index-7znpc\" (UID: \"2efa198c-4fe6-4ed2-9627-14a9ce525363\") " pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.220545 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7znpc"] Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.310076 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvhm8\" (UniqueName: \"kubernetes.io/projected/2efa198c-4fe6-4ed2-9627-14a9ce525363-kube-api-access-hvhm8\") pod \"openstack-operator-index-7znpc\" (UID: \"2efa198c-4fe6-4ed2-9627-14a9ce525363\") " pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.344631 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hvhm8\" (UniqueName: \"kubernetes.io/projected/2efa198c-4fe6-4ed2-9627-14a9ce525363-kube-api-access-hvhm8\") pod \"openstack-operator-index-7znpc\" (UID: \"2efa198c-4fe6-4ed2-9627-14a9ce525363\") " pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.531298 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:21:59 crc kubenswrapper[4931]: I0130 05:21:59.820829 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-kpdtt" podUID="3fd521ad-cd31-4827-99cf-2d78ddcf12ab" containerName="registry-server" containerID="cri-o://fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d" gracePeriod=2 Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.027765 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7znpc"] Jan 30 05:22:00 crc kubenswrapper[4931]: W0130 05:22:00.038184 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2efa198c_4fe6_4ed2_9627_14a9ce525363.slice/crio-b9eac2e3334db1a58d7b71ee7385d902a669e28be2fafb0e798e7885f51cc328 WatchSource:0}: Error finding container b9eac2e3334db1a58d7b71ee7385d902a669e28be2fafb0e798e7885f51cc328: Status 404 returned error can't find the container with id b9eac2e3334db1a58d7b71ee7385d902a669e28be2fafb0e798e7885f51cc328 Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.225541 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kpdtt" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.421940 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrtl\" (UniqueName: \"kubernetes.io/projected/3fd521ad-cd31-4827-99cf-2d78ddcf12ab-kube-api-access-mlrtl\") pod \"3fd521ad-cd31-4827-99cf-2d78ddcf12ab\" (UID: \"3fd521ad-cd31-4827-99cf-2d78ddcf12ab\") " Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.439640 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd521ad-cd31-4827-99cf-2d78ddcf12ab-kube-api-access-mlrtl" (OuterVolumeSpecName: "kube-api-access-mlrtl") pod "3fd521ad-cd31-4827-99cf-2d78ddcf12ab" (UID: "3fd521ad-cd31-4827-99cf-2d78ddcf12ab"). InnerVolumeSpecName "kube-api-access-mlrtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.524813 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlrtl\" (UniqueName: \"kubernetes.io/projected/3fd521ad-cd31-4827-99cf-2d78ddcf12ab-kube-api-access-mlrtl\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.829654 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7znpc" event={"ID":"2efa198c-4fe6-4ed2-9627-14a9ce525363","Type":"ContainerStarted","Data":"88926be0d9c4f030e2810b4f687d3e0e633f3337293beb7ca2e2922237f4364d"} Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.829733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7znpc" event={"ID":"2efa198c-4fe6-4ed2-9627-14a9ce525363","Type":"ContainerStarted","Data":"b9eac2e3334db1a58d7b71ee7385d902a669e28be2fafb0e798e7885f51cc328"} Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.831567 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="3fd521ad-cd31-4827-99cf-2d78ddcf12ab" containerID="fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d" exitCode=0 Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.831644 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpdtt" event={"ID":"3fd521ad-cd31-4827-99cf-2d78ddcf12ab","Type":"ContainerDied","Data":"fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d"} Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.831694 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kpdtt" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.831712 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpdtt" event={"ID":"3fd521ad-cd31-4827-99cf-2d78ddcf12ab","Type":"ContainerDied","Data":"c8b5caf549c14da8f9183cfe6a9eab8cca937640c3b84c97dfebb701545714e1"} Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.831799 4931 scope.go:117] "RemoveContainer" containerID="fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.863549 4931 scope.go:117] "RemoveContainer" containerID="fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d" Jan 30 05:22:00 crc kubenswrapper[4931]: E0130 05:22:00.864301 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d\": container with ID starting with fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d not found: ID does not exist" containerID="fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.864365 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d"} err="failed to get container status \"fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d\": rpc error: code = NotFound desc = could not find container \"fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d\": container with ID starting with fba7625e8263d7e47fd7c432e2e98bf81d8fbf8f542e2b36f7fc6b008a3d242d not found: ID does not exist" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.864863 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7znpc" podStartSLOduration=1.4370506189999999 podStartE2EDuration="1.864841083s" podCreationTimestamp="2026-01-30 05:21:59 +0000 UTC" firstStartedPulling="2026-01-30 05:22:00.042654987 +0000 UTC m=+855.412565264" lastFinishedPulling="2026-01-30 05:22:00.470445471 +0000 UTC m=+855.840355728" observedRunningTime="2026-01-30 05:22:00.852620707 +0000 UTC m=+856.222531004" watchObservedRunningTime="2026-01-30 05:22:00.864841083 +0000 UTC m=+856.234751380" Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.884358 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kpdtt"] Jan 30 05:22:00 crc kubenswrapper[4931]: I0130 05:22:00.890884 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-kpdtt"] Jan 30 05:22:01 crc kubenswrapper[4931]: I0130 05:22:01.436286 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd521ad-cd31-4827-99cf-2d78ddcf12ab" path="/var/lib/kubelet/pods/3fd521ad-cd31-4827-99cf-2d78ddcf12ab/volumes" Jan 30 05:22:09 crc kubenswrapper[4931]: I0130 05:22:09.532477 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:22:09 crc kubenswrapper[4931]: I0130 05:22:09.533273 4931 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:22:09 crc kubenswrapper[4931]: I0130 05:22:09.587234 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:22:09 crc kubenswrapper[4931]: I0130 05:22:09.953675 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7znpc" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.309919 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff"] Jan 30 05:22:16 crc kubenswrapper[4931]: E0130 05:22:16.310599 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd521ad-cd31-4827-99cf-2d78ddcf12ab" containerName="registry-server" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.310621 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd521ad-cd31-4827-99cf-2d78ddcf12ab" containerName="registry-server" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.310817 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd521ad-cd31-4827-99cf-2d78ddcf12ab" containerName="registry-server" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.312199 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.315492 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mwjcp" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.328794 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff"] Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.507586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlg4v\" (UniqueName: \"kubernetes.io/projected/7fc41231-569f-429f-bcc3-d7d63888874b-kube-api-access-zlg4v\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.507657 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.507759 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 
05:22:16.609523 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlg4v\" (UniqueName: \"kubernetes.io/projected/7fc41231-569f-429f-bcc3-d7d63888874b-kube-api-access-zlg4v\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.609598 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.609723 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.610548 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.611155 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.644293 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlg4v\" (UniqueName: \"kubernetes.io/projected/7fc41231-569f-429f-bcc3-d7d63888874b-kube-api-access-zlg4v\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:16 crc kubenswrapper[4931]: I0130 05:22:16.943685 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:17 crc kubenswrapper[4931]: I0130 05:22:17.485027 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff"] Jan 30 05:22:17 crc kubenswrapper[4931]: I0130 05:22:17.979316 4931 generic.go:334] "Generic (PLEG): container finished" podID="7fc41231-569f-429f-bcc3-d7d63888874b" containerID="1631dbc97906fa3af4c41cf4d6966ae7a42d7a3a73774352a5be878e540d98be" exitCode=0 Jan 30 05:22:17 crc kubenswrapper[4931]: I0130 05:22:17.979501 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" event={"ID":"7fc41231-569f-429f-bcc3-d7d63888874b","Type":"ContainerDied","Data":"1631dbc97906fa3af4c41cf4d6966ae7a42d7a3a73774352a5be878e540d98be"} Jan 30 05:22:17 crc kubenswrapper[4931]: I0130 05:22:17.979837 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" event={"ID":"7fc41231-569f-429f-bcc3-d7d63888874b","Type":"ContainerStarted","Data":"99df7d919fc8654b14cd44ee0356e56e18b7f6c0d432d68347ce64f41af5146f"} Jan 30 05:22:18 crc kubenswrapper[4931]: I0130 05:22:18.991281 4931 generic.go:334] "Generic (PLEG): container finished" podID="7fc41231-569f-429f-bcc3-d7d63888874b" containerID="70f5f3382fe3f09c39091adb22d065ed10d156811c6a803959c3ca9e2e50e29b" exitCode=0 Jan 30 05:22:18 crc kubenswrapper[4931]: I0130 05:22:18.991394 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" event={"ID":"7fc41231-569f-429f-bcc3-d7d63888874b","Type":"ContainerDied","Data":"70f5f3382fe3f09c39091adb22d065ed10d156811c6a803959c3ca9e2e50e29b"} Jan 30 05:22:20 crc kubenswrapper[4931]: I0130 05:22:20.022224 4931 generic.go:334] "Generic (PLEG): container finished" podID="7fc41231-569f-429f-bcc3-d7d63888874b" containerID="03aa25a03c595886ed804fdd0f68916aaddf4c83ea63500208775fc173916782" exitCode=0 Jan 30 05:22:20 crc kubenswrapper[4931]: I0130 05:22:20.022317 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" event={"ID":"7fc41231-569f-429f-bcc3-d7d63888874b","Type":"ContainerDied","Data":"03aa25a03c595886ed804fdd0f68916aaddf4c83ea63500208775fc173916782"} Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.392367 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.588608 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-util\") pod \"7fc41231-569f-429f-bcc3-d7d63888874b\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.588695 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-bundle\") pod \"7fc41231-569f-429f-bcc3-d7d63888874b\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.588793 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlg4v\" (UniqueName: \"kubernetes.io/projected/7fc41231-569f-429f-bcc3-d7d63888874b-kube-api-access-zlg4v\") pod \"7fc41231-569f-429f-bcc3-d7d63888874b\" (UID: \"7fc41231-569f-429f-bcc3-d7d63888874b\") " Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.589945 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-bundle" (OuterVolumeSpecName: "bundle") pod "7fc41231-569f-429f-bcc3-d7d63888874b" (UID: "7fc41231-569f-429f-bcc3-d7d63888874b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.599774 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc41231-569f-429f-bcc3-d7d63888874b-kube-api-access-zlg4v" (OuterVolumeSpecName: "kube-api-access-zlg4v") pod "7fc41231-569f-429f-bcc3-d7d63888874b" (UID: "7fc41231-569f-429f-bcc3-d7d63888874b"). InnerVolumeSpecName "kube-api-access-zlg4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.619276 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-util" (OuterVolumeSpecName: "util") pod "7fc41231-569f-429f-bcc3-d7d63888874b" (UID: "7fc41231-569f-429f-bcc3-d7d63888874b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.649171 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7ztfm"] Jan 30 05:22:21 crc kubenswrapper[4931]: E0130 05:22:21.649459 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc41231-569f-429f-bcc3-d7d63888874b" containerName="extract" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.649474 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc41231-569f-429f-bcc3-d7d63888874b" containerName="extract" Jan 30 05:22:21 crc kubenswrapper[4931]: E0130 05:22:21.649490 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc41231-569f-429f-bcc3-d7d63888874b" containerName="pull" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.649498 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc41231-569f-429f-bcc3-d7d63888874b" containerName="pull" Jan 30 05:22:21 crc kubenswrapper[4931]: E0130 05:22:21.649516 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc41231-569f-429f-bcc3-d7d63888874b" containerName="util" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.649524 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc41231-569f-429f-bcc3-d7d63888874b" containerName="util" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.649678 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc41231-569f-429f-bcc3-d7d63888874b" containerName="extract" Jan 30 05:22:21 crc 
kubenswrapper[4931]: I0130 05:22:21.650662 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.666859 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7ztfm"] Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.690041 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlg4v\" (UniqueName: \"kubernetes.io/projected/7fc41231-569f-429f-bcc3-d7d63888874b-kube-api-access-zlg4v\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.690238 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-util\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.690359 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fc41231-569f-429f-bcc3-d7d63888874b-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.792166 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6b2j\" (UniqueName: \"kubernetes.io/projected/49cdbfd0-6da5-4669-b468-c4622ed9d57e-kube-api-access-p6b2j\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.792519 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-catalog-content\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc 
kubenswrapper[4931]: I0130 05:22:21.792660 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-utilities\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.893222 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-catalog-content\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.893270 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-utilities\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.893322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6b2j\" (UniqueName: \"kubernetes.io/projected/49cdbfd0-6da5-4669-b468-c4622ed9d57e-kube-api-access-p6b2j\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.894020 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-catalog-content\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc 
kubenswrapper[4931]: I0130 05:22:21.894551 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-utilities\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.916772 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6b2j\" (UniqueName: \"kubernetes.io/projected/49cdbfd0-6da5-4669-b468-c4622ed9d57e-kube-api-access-p6b2j\") pod \"community-operators-7ztfm\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:21 crc kubenswrapper[4931]: I0130 05:22:21.978257 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:22 crc kubenswrapper[4931]: I0130 05:22:22.037908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" event={"ID":"7fc41231-569f-429f-bcc3-d7d63888874b","Type":"ContainerDied","Data":"99df7d919fc8654b14cd44ee0356e56e18b7f6c0d432d68347ce64f41af5146f"} Jan 30 05:22:22 crc kubenswrapper[4931]: I0130 05:22:22.037954 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99df7d919fc8654b14cd44ee0356e56e18b7f6c0d432d68347ce64f41af5146f" Jan 30 05:22:22 crc kubenswrapper[4931]: I0130 05:22:22.037966 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff" Jan 30 05:22:22 crc kubenswrapper[4931]: I0130 05:22:22.447394 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7ztfm"] Jan 30 05:22:23 crc kubenswrapper[4931]: I0130 05:22:23.048395 4931 generic.go:334] "Generic (PLEG): container finished" podID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerID="314d724b02d6787a215dc9b3f666ab62c4c4c772bf2b499443df86e2faaa1c65" exitCode=0 Jan 30 05:22:23 crc kubenswrapper[4931]: I0130 05:22:23.048515 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ztfm" event={"ID":"49cdbfd0-6da5-4669-b468-c4622ed9d57e","Type":"ContainerDied","Data":"314d724b02d6787a215dc9b3f666ab62c4c4c772bf2b499443df86e2faaa1c65"} Jan 30 05:22:23 crc kubenswrapper[4931]: I0130 05:22:23.048565 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ztfm" event={"ID":"49cdbfd0-6da5-4669-b468-c4622ed9d57e","Type":"ContainerStarted","Data":"3f94c7bde2b7138835d9baec6c7d669c847406880915fd1b9d23e1f123838119"} Jan 30 05:22:24 crc kubenswrapper[4931]: I0130 05:22:24.073526 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ztfm" event={"ID":"49cdbfd0-6da5-4669-b468-c4622ed9d57e","Type":"ContainerStarted","Data":"b7e2d85f8046162ebaf6700ba0b11f2c28515570d4dc72972405a6137dfd4e9d"} Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.043119 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6qrzn"] Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.045148 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.061799 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qrzn"] Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.091443 4931 generic.go:334] "Generic (PLEG): container finished" podID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerID="b7e2d85f8046162ebaf6700ba0b11f2c28515570d4dc72972405a6137dfd4e9d" exitCode=0 Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.091492 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ztfm" event={"ID":"49cdbfd0-6da5-4669-b468-c4622ed9d57e","Type":"ContainerDied","Data":"b7e2d85f8046162ebaf6700ba0b11f2c28515570d4dc72972405a6137dfd4e9d"} Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.145380 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-utilities\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.145452 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qfh\" (UniqueName: \"kubernetes.io/projected/2a095d89-69cc-45d2-89b3-f363ba80192b-kube-api-access-c2qfh\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.145495 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-catalog-content\") pod \"certified-operators-6qrzn\" (UID: 
\"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.246772 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-utilities\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.246826 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qfh\" (UniqueName: \"kubernetes.io/projected/2a095d89-69cc-45d2-89b3-f363ba80192b-kube-api-access-c2qfh\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.246865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-catalog-content\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.247551 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-catalog-content\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.247584 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-utilities\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") 
" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.284652 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qfh\" (UniqueName: \"kubernetes.io/projected/2a095d89-69cc-45d2-89b3-f363ba80192b-kube-api-access-c2qfh\") pod \"certified-operators-6qrzn\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.371011 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:25 crc kubenswrapper[4931]: I0130 05:22:25.685364 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qrzn"] Jan 30 05:22:26 crc kubenswrapper[4931]: I0130 05:22:26.101313 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ztfm" event={"ID":"49cdbfd0-6da5-4669-b468-c4622ed9d57e","Type":"ContainerStarted","Data":"662675c88d6a303836d0fba35c2d80ebd8b2008c91220499fa1991520747f37e"} Jan 30 05:22:26 crc kubenswrapper[4931]: I0130 05:22:26.104535 4931 generic.go:334] "Generic (PLEG): container finished" podID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerID="6f5368956d7211b22c5fc65e3edeafd0d348af26b0c6292fae2e9e7510e83b64" exitCode=0 Jan 30 05:22:26 crc kubenswrapper[4931]: I0130 05:22:26.104578 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qrzn" event={"ID":"2a095d89-69cc-45d2-89b3-f363ba80192b","Type":"ContainerDied","Data":"6f5368956d7211b22c5fc65e3edeafd0d348af26b0c6292fae2e9e7510e83b64"} Jan 30 05:22:26 crc kubenswrapper[4931]: I0130 05:22:26.104611 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qrzn" 
event={"ID":"2a095d89-69cc-45d2-89b3-f363ba80192b","Type":"ContainerStarted","Data":"e14a4370cfa1b43d0fed3626b0eff804af5fd5b2c13e420d03f69008172714ba"} Jan 30 05:22:26 crc kubenswrapper[4931]: I0130 05:22:26.122186 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7ztfm" podStartSLOduration=2.667332144 podStartE2EDuration="5.122170308s" podCreationTimestamp="2026-01-30 05:22:21 +0000 UTC" firstStartedPulling="2026-01-30 05:22:23.051452882 +0000 UTC m=+878.421363179" lastFinishedPulling="2026-01-30 05:22:25.506291086 +0000 UTC m=+880.876201343" observedRunningTime="2026-01-30 05:22:26.120197822 +0000 UTC m=+881.490108109" watchObservedRunningTime="2026-01-30 05:22:26.122170308 +0000 UTC m=+881.492080565" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.121626 4931 generic.go:334] "Generic (PLEG): container finished" podID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerID="cf8658320b6c2873656da860206e90f8011ee8e08a294c3eccdfc50db5066b48" exitCode=0 Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.123103 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qrzn" event={"ID":"2a095d89-69cc-45d2-89b3-f363ba80192b","Type":"ContainerDied","Data":"cf8658320b6c2873656da860206e90f8011ee8e08a294c3eccdfc50db5066b48"} Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.301524 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9"] Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.303334 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.307313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-7pfmh" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.325580 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9"] Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.362874 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.362935 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.475951 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dthxp\" (UniqueName: \"kubernetes.io/projected/27c443b8-82d2-41c1-b747-b89e6cb44f16-kube-api-access-dthxp\") pod \"openstack-operator-controller-init-757f46c65d-rscb9\" (UID: \"27c443b8-82d2-41c1-b747-b89e6cb44f16\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.577381 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dthxp\" (UniqueName: 
\"kubernetes.io/projected/27c443b8-82d2-41c1-b747-b89e6cb44f16-kube-api-access-dthxp\") pod \"openstack-operator-controller-init-757f46c65d-rscb9\" (UID: \"27c443b8-82d2-41c1-b747-b89e6cb44f16\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.599829 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dthxp\" (UniqueName: \"kubernetes.io/projected/27c443b8-82d2-41c1-b747-b89e6cb44f16-kube-api-access-dthxp\") pod \"openstack-operator-controller-init-757f46c65d-rscb9\" (UID: \"27c443b8-82d2-41c1-b747-b89e6cb44f16\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.629933 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" Jan 30 05:22:27 crc kubenswrapper[4931]: I0130 05:22:27.906983 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9"] Jan 30 05:22:27 crc kubenswrapper[4931]: W0130 05:22:27.915788 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27c443b8_82d2_41c1_b747_b89e6cb44f16.slice/crio-57c02a6fe2a6af8364a73d91ebf8b1df7f7ce70faeba0627d6f563cf4c4951bc WatchSource:0}: Error finding container 57c02a6fe2a6af8364a73d91ebf8b1df7f7ce70faeba0627d6f563cf4c4951bc: Status 404 returned error can't find the container with id 57c02a6fe2a6af8364a73d91ebf8b1df7f7ce70faeba0627d6f563cf4c4951bc Jan 30 05:22:28 crc kubenswrapper[4931]: I0130 05:22:28.131375 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" 
event={"ID":"27c443b8-82d2-41c1-b747-b89e6cb44f16","Type":"ContainerStarted","Data":"57c02a6fe2a6af8364a73d91ebf8b1df7f7ce70faeba0627d6f563cf4c4951bc"} Jan 30 05:22:28 crc kubenswrapper[4931]: I0130 05:22:28.134862 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qrzn" event={"ID":"2a095d89-69cc-45d2-89b3-f363ba80192b","Type":"ContainerStarted","Data":"cfc2d03f7956dc0e0818cdc83356a4e8e3fce5194a922940559190e5c250570f"} Jan 30 05:22:28 crc kubenswrapper[4931]: I0130 05:22:28.158130 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6qrzn" podStartSLOduration=1.711368295 podStartE2EDuration="3.158112226s" podCreationTimestamp="2026-01-30 05:22:25 +0000 UTC" firstStartedPulling="2026-01-30 05:22:26.106338397 +0000 UTC m=+881.476248684" lastFinishedPulling="2026-01-30 05:22:27.553082358 +0000 UTC m=+882.922992615" observedRunningTime="2026-01-30 05:22:28.155384182 +0000 UTC m=+883.525294459" watchObservedRunningTime="2026-01-30 05:22:28.158112226 +0000 UTC m=+883.528022483" Jan 30 05:22:31 crc kubenswrapper[4931]: I0130 05:22:31.979459 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:31 crc kubenswrapper[4931]: I0130 05:22:31.979833 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:32 crc kubenswrapper[4931]: I0130 05:22:32.040983 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:32 crc kubenswrapper[4931]: I0130 05:22:32.245049 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:33 crc kubenswrapper[4931]: I0130 05:22:33.179288 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" event={"ID":"27c443b8-82d2-41c1-b747-b89e6cb44f16","Type":"ContainerStarted","Data":"0a18f96cc0c8c1e6c5347eb4ea7dc68d3091fed656839711a9073b590c3d7621"} Jan 30 05:22:33 crc kubenswrapper[4931]: I0130 05:22:33.179403 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" Jan 30 05:22:33 crc kubenswrapper[4931]: I0130 05:22:33.224702 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" podStartSLOduration=1.6786490120000002 podStartE2EDuration="6.224677296s" podCreationTimestamp="2026-01-30 05:22:27 +0000 UTC" firstStartedPulling="2026-01-30 05:22:27.918370368 +0000 UTC m=+883.288280625" lastFinishedPulling="2026-01-30 05:22:32.464398652 +0000 UTC m=+887.834308909" observedRunningTime="2026-01-30 05:22:33.220344836 +0000 UTC m=+888.590255133" watchObservedRunningTime="2026-01-30 05:22:33.224677296 +0000 UTC m=+888.594587593" Jan 30 05:22:34 crc kubenswrapper[4931]: I0130 05:22:34.430260 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7ztfm"] Jan 30 05:22:34 crc kubenswrapper[4931]: I0130 05:22:34.430818 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7ztfm" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="registry-server" containerID="cri-o://662675c88d6a303836d0fba35c2d80ebd8b2008c91220499fa1991520747f37e" gracePeriod=2 Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.200013 4931 generic.go:334] "Generic (PLEG): container finished" podID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerID="662675c88d6a303836d0fba35c2d80ebd8b2008c91220499fa1991520747f37e" exitCode=0 Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.200054 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-7ztfm" event={"ID":"49cdbfd0-6da5-4669-b468-c4622ed9d57e","Type":"ContainerDied","Data":"662675c88d6a303836d0fba35c2d80ebd8b2008c91220499fa1991520747f37e"} Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.371250 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.371558 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.375590 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.456475 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.487952 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-utilities\") pod \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.488034 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6b2j\" (UniqueName: \"kubernetes.io/projected/49cdbfd0-6da5-4669-b468-c4622ed9d57e-kube-api-access-p6b2j\") pod \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.488098 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-catalog-content\") pod 
\"49cdbfd0-6da5-4669-b468-c4622ed9d57e\" (UID: \"49cdbfd0-6da5-4669-b468-c4622ed9d57e\") " Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.490789 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-utilities" (OuterVolumeSpecName: "utilities") pod "49cdbfd0-6da5-4669-b468-c4622ed9d57e" (UID: "49cdbfd0-6da5-4669-b468-c4622ed9d57e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.507472 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49cdbfd0-6da5-4669-b468-c4622ed9d57e-kube-api-access-p6b2j" (OuterVolumeSpecName: "kube-api-access-p6b2j") pod "49cdbfd0-6da5-4669-b468-c4622ed9d57e" (UID: "49cdbfd0-6da5-4669-b468-c4622ed9d57e"). InnerVolumeSpecName "kube-api-access-p6b2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.575507 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49cdbfd0-6da5-4669-b468-c4622ed9d57e" (UID: "49cdbfd0-6da5-4669-b468-c4622ed9d57e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.590062 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.590114 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6b2j\" (UniqueName: \"kubernetes.io/projected/49cdbfd0-6da5-4669-b468-c4622ed9d57e-kube-api-access-p6b2j\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:35 crc kubenswrapper[4931]: I0130 05:22:35.590143 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49cdbfd0-6da5-4669-b468-c4622ed9d57e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.213872 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ztfm" event={"ID":"49cdbfd0-6da5-4669-b468-c4622ed9d57e","Type":"ContainerDied","Data":"3f94c7bde2b7138835d9baec6c7d669c847406880915fd1b9d23e1f123838119"} Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.213955 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7ztfm" Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.213970 4931 scope.go:117] "RemoveContainer" containerID="662675c88d6a303836d0fba35c2d80ebd8b2008c91220499fa1991520747f37e" Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.240237 4931 scope.go:117] "RemoveContainer" containerID="b7e2d85f8046162ebaf6700ba0b11f2c28515570d4dc72972405a6137dfd4e9d" Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.283806 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7ztfm"] Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.284000 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.286264 4931 scope.go:117] "RemoveContainer" containerID="314d724b02d6787a215dc9b3f666ab62c4c4c772bf2b499443df86e2faaa1c65" Jan 30 05:22:36 crc kubenswrapper[4931]: I0130 05:22:36.287729 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7ztfm"] Jan 30 05:22:37 crc kubenswrapper[4931]: I0130 05:22:37.435793 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" path="/var/lib/kubelet/pods/49cdbfd0-6da5-4669-b468-c4622ed9d57e/volumes" Jan 30 05:22:37 crc kubenswrapper[4931]: I0130 05:22:37.634176 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-rscb9" Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.037139 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qrzn"] Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.037553 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6qrzn" 
podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="registry-server" containerID="cri-o://cfc2d03f7956dc0e0818cdc83356a4e8e3fce5194a922940559190e5c250570f" gracePeriod=2 Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.245517 4931 generic.go:334] "Generic (PLEG): container finished" podID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerID="cfc2d03f7956dc0e0818cdc83356a4e8e3fce5194a922940559190e5c250570f" exitCode=0 Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.245742 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qrzn" event={"ID":"2a095d89-69cc-45d2-89b3-f363ba80192b","Type":"ContainerDied","Data":"cfc2d03f7956dc0e0818cdc83356a4e8e3fce5194a922940559190e5c250570f"} Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.504759 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.654854 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-utilities\") pod \"2a095d89-69cc-45d2-89b3-f363ba80192b\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.654947 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-catalog-content\") pod \"2a095d89-69cc-45d2-89b3-f363ba80192b\" (UID: \"2a095d89-69cc-45d2-89b3-f363ba80192b\") " Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.655036 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2qfh\" (UniqueName: \"kubernetes.io/projected/2a095d89-69cc-45d2-89b3-f363ba80192b-kube-api-access-c2qfh\") pod \"2a095d89-69cc-45d2-89b3-f363ba80192b\" (UID: 
\"2a095d89-69cc-45d2-89b3-f363ba80192b\") " Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.656725 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-utilities" (OuterVolumeSpecName: "utilities") pod "2a095d89-69cc-45d2-89b3-f363ba80192b" (UID: "2a095d89-69cc-45d2-89b3-f363ba80192b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.660464 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a095d89-69cc-45d2-89b3-f363ba80192b-kube-api-access-c2qfh" (OuterVolumeSpecName: "kube-api-access-c2qfh") pod "2a095d89-69cc-45d2-89b3-f363ba80192b" (UID: "2a095d89-69cc-45d2-89b3-f363ba80192b"). InnerVolumeSpecName "kube-api-access-c2qfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.739676 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a095d89-69cc-45d2-89b3-f363ba80192b" (UID: "2a095d89-69cc-45d2-89b3-f363ba80192b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.757071 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2qfh\" (UniqueName: \"kubernetes.io/projected/2a095d89-69cc-45d2-89b3-f363ba80192b-kube-api-access-c2qfh\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.757126 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:39 crc kubenswrapper[4931]: I0130 05:22:39.757147 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a095d89-69cc-45d2-89b3-f363ba80192b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:40 crc kubenswrapper[4931]: I0130 05:22:40.256671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qrzn" event={"ID":"2a095d89-69cc-45d2-89b3-f363ba80192b","Type":"ContainerDied","Data":"e14a4370cfa1b43d0fed3626b0eff804af5fd5b2c13e420d03f69008172714ba"} Jan 30 05:22:40 crc kubenswrapper[4931]: I0130 05:22:40.256743 4931 scope.go:117] "RemoveContainer" containerID="cfc2d03f7956dc0e0818cdc83356a4e8e3fce5194a922940559190e5c250570f" Jan 30 05:22:40 crc kubenswrapper[4931]: I0130 05:22:40.256921 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qrzn" Jan 30 05:22:40 crc kubenswrapper[4931]: I0130 05:22:40.284745 4931 scope.go:117] "RemoveContainer" containerID="cf8658320b6c2873656da860206e90f8011ee8e08a294c3eccdfc50db5066b48" Jan 30 05:22:40 crc kubenswrapper[4931]: I0130 05:22:40.337340 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qrzn"] Jan 30 05:22:40 crc kubenswrapper[4931]: I0130 05:22:40.346212 4931 scope.go:117] "RemoveContainer" containerID="6f5368956d7211b22c5fc65e3edeafd0d348af26b0c6292fae2e9e7510e83b64" Jan 30 05:22:40 crc kubenswrapper[4931]: I0130 05:22:40.349381 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6qrzn"] Jan 30 05:22:41 crc kubenswrapper[4931]: I0130 05:22:41.611602 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" path="/var/lib/kubelet/pods/2a095d89-69cc-45d2-89b3-f363ba80192b/volumes" Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.292241 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qhkx7"] Jan 30 05:22:44 crc kubenswrapper[4931]: E0130 05:22:44.292610 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="extract-utilities" Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.292631 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="extract-utilities" Jan 30 05:22:44 crc kubenswrapper[4931]: E0130 05:22:44.292667 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="registry-server" Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.293073 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="registry-server" 
Jan 30 05:22:44 crc kubenswrapper[4931]: E0130 05:22:44.293116 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="registry-server"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.293129 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="registry-server"
Jan 30 05:22:44 crc kubenswrapper[4931]: E0130 05:22:44.293154 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="extract-content"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.293168 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="extract-content"
Jan 30 05:22:44 crc kubenswrapper[4931]: E0130 05:22:44.293184 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="extract-utilities"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.293196 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="extract-utilities"
Jan 30 05:22:44 crc kubenswrapper[4931]: E0130 05:22:44.293216 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="extract-content"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.293228 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="extract-content"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.293461 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a095d89-69cc-45d2-89b3-f363ba80192b" containerName="registry-server"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.293488 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="49cdbfd0-6da5-4669-b468-c4622ed9d57e" containerName="registry-server"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.294847 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.311646 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhkx7"]
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.444927 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-utilities\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.445023 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb44p\" (UniqueName: \"kubernetes.io/projected/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-kube-api-access-hb44p\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.445191 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-catalog-content\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.547035 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb44p\" (UniqueName: \"kubernetes.io/projected/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-kube-api-access-hb44p\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.547094 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-catalog-content\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.547250 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-utilities\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.547677 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-utilities\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.548468 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-catalog-content\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.570475 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb44p\" (UniqueName: \"kubernetes.io/projected/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-kube-api-access-hb44p\") pod \"redhat-marketplace-qhkx7\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") " pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.619118 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:44 crc kubenswrapper[4931]: I0130 05:22:44.869502 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhkx7"]
Jan 30 05:22:45 crc kubenswrapper[4931]: I0130 05:22:45.298889 4931 generic.go:334] "Generic (PLEG): container finished" podID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerID="3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e" exitCode=0
Jan 30 05:22:45 crc kubenswrapper[4931]: I0130 05:22:45.298934 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhkx7" event={"ID":"53b1acfb-8c75-448d-9ef1-94eb07b92e6b","Type":"ContainerDied","Data":"3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e"}
Jan 30 05:22:45 crc kubenswrapper[4931]: I0130 05:22:45.298988 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhkx7" event={"ID":"53b1acfb-8c75-448d-9ef1-94eb07b92e6b","Type":"ContainerStarted","Data":"0fc85161717a539fa8da0c468567aa7f1e508cd927e38c401b64d7b954772d30"}
Jan 30 05:22:46 crc kubenswrapper[4931]: I0130 05:22:46.309069 4931 generic.go:334] "Generic (PLEG): container finished" podID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerID="e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2" exitCode=0
Jan 30 05:22:46 crc kubenswrapper[4931]: I0130 05:22:46.309279 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhkx7" event={"ID":"53b1acfb-8c75-448d-9ef1-94eb07b92e6b","Type":"ContainerDied","Data":"e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2"}
Jan 30 05:22:47 crc kubenswrapper[4931]: I0130 05:22:47.323514 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhkx7" event={"ID":"53b1acfb-8c75-448d-9ef1-94eb07b92e6b","Type":"ContainerStarted","Data":"93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9"}
Jan 30 05:22:47 crc kubenswrapper[4931]: I0130 05:22:47.352971 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qhkx7" podStartSLOduration=1.890929268 podStartE2EDuration="3.352946081s" podCreationTimestamp="2026-01-30 05:22:44 +0000 UTC" firstStartedPulling="2026-01-30 05:22:45.300466132 +0000 UTC m=+900.670376429" lastFinishedPulling="2026-01-30 05:22:46.762482975 +0000 UTC m=+902.132393242" observedRunningTime="2026-01-30 05:22:47.348842397 +0000 UTC m=+902.718752694" watchObservedRunningTime="2026-01-30 05:22:47.352946081 +0000 UTC m=+902.722856368"
Jan 30 05:22:54 crc kubenswrapper[4931]: I0130 05:22:54.620723 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:54 crc kubenswrapper[4931]: I0130 05:22:54.621271 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:54 crc kubenswrapper[4931]: I0130 05:22:54.691701 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:55 crc kubenswrapper[4931]: I0130 05:22:55.450631 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:56 crc kubenswrapper[4931]: I0130 05:22:56.643930 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhkx7"]
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.363456 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.363535 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.397606 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qhkx7" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="registry-server" containerID="cri-o://93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9" gracePeriod=2
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.783575 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.874563 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb44p\" (UniqueName: \"kubernetes.io/projected/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-kube-api-access-hb44p\") pod \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") "
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.874650 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-utilities\") pod \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") "
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.874677 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-catalog-content\") pod \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\" (UID: \"53b1acfb-8c75-448d-9ef1-94eb07b92e6b\") "
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.875389 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-utilities" (OuterVolumeSpecName: "utilities") pod "53b1acfb-8c75-448d-9ef1-94eb07b92e6b" (UID: "53b1acfb-8c75-448d-9ef1-94eb07b92e6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.894704 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53b1acfb-8c75-448d-9ef1-94eb07b92e6b" (UID: "53b1acfb-8c75-448d-9ef1-94eb07b92e6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.895199 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-kube-api-access-hb44p" (OuterVolumeSpecName: "kube-api-access-hb44p") pod "53b1acfb-8c75-448d-9ef1-94eb07b92e6b" (UID: "53b1acfb-8c75-448d-9ef1-94eb07b92e6b"). InnerVolumeSpecName "kube-api-access-hb44p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.975337 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb44p\" (UniqueName: \"kubernetes.io/projected/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-kube-api-access-hb44p\") on node \"crc\" DevicePath \"\""
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.975368 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:22:57 crc kubenswrapper[4931]: I0130 05:22:57.975379 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b1acfb-8c75-448d-9ef1-94eb07b92e6b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.408835 4931 generic.go:334] "Generic (PLEG): container finished" podID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerID="93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9" exitCode=0
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.408936 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhkx7"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.408958 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhkx7" event={"ID":"53b1acfb-8c75-448d-9ef1-94eb07b92e6b","Type":"ContainerDied","Data":"93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9"}
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.409070 4931 scope.go:117] "RemoveContainer" containerID="93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.409234 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhkx7" event={"ID":"53b1acfb-8c75-448d-9ef1-94eb07b92e6b","Type":"ContainerDied","Data":"0fc85161717a539fa8da0c468567aa7f1e508cd927e38c401b64d7b954772d30"}
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.441821 4931 scope.go:117] "RemoveContainer" containerID="e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.455028 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhkx7"]
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.461156 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhkx7"]
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.468239 4931 scope.go:117] "RemoveContainer" containerID="3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.513683 4931 scope.go:117] "RemoveContainer" containerID="93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9"
Jan 30 05:22:58 crc kubenswrapper[4931]: E0130 05:22:58.514104 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9\": container with ID starting with 93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9 not found: ID does not exist" containerID="93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.514146 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9"} err="failed to get container status \"93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9\": rpc error: code = NotFound desc = could not find container \"93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9\": container with ID starting with 93cb27deefd25bc0a0cfa7742cbed5fe5edbbc049bfbd9f8b6f8883f1db330d9 not found: ID does not exist"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.514170 4931 scope.go:117] "RemoveContainer" containerID="e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2"
Jan 30 05:22:58 crc kubenswrapper[4931]: E0130 05:22:58.514591 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2\": container with ID starting with e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2 not found: ID does not exist" containerID="e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.514616 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2"} err="failed to get container status \"e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2\": rpc error: code = NotFound desc = could not find container \"e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2\": container with ID starting with e9b29114ef6554ae05e9356410b8f0c63040eddcdda9e52cdd177b28297f60b2 not found: ID does not exist"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.514632 4931 scope.go:117] "RemoveContainer" containerID="3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e"
Jan 30 05:22:58 crc kubenswrapper[4931]: E0130 05:22:58.515249 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e\": container with ID starting with 3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e not found: ID does not exist" containerID="3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e"
Jan 30 05:22:58 crc kubenswrapper[4931]: I0130 05:22:58.515278 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e"} err="failed to get container status \"3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e\": rpc error: code = NotFound desc = could not find container \"3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e\": container with ID starting with 3f4c917911a567a55fdcec78e67814ce33c0c7e69484fc283691b551ba36eb0e not found: ID does not exist"
Jan 30 05:22:59 crc kubenswrapper[4931]: I0130 05:22:59.434158 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" path="/var/lib/kubelet/pods/53b1acfb-8c75-448d-9ef1-94eb07b92e6b/volumes"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.031204 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk"]
Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.032059 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="extract-content"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.032102 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="extract-content"
Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.032118 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="extract-utilities"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.032128 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="extract-utilities"
Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.032143 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="registry-server"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.032151 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="registry-server"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.032283 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b1acfb-8c75-448d-9ef1-94eb07b92e6b" containerName="registry-server"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.032865 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.034526 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-pjqv5"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.035022 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.035754 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.037071 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-z754h"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.043693 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.052914 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.058615 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.059834 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.063801 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pwqfx"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.076461 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.077242 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.080256 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.082309 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4hght"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.096524 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.099129 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.101375 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.103029 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vrmds"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.108061 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.129572 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.130543 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.132698 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.133231 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bd57t"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.164602 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.165301 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.172944 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.173925 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.177030 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.177340 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5fpjl"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.177581 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lrgxk"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.187153 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.191631 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.192477 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqxq\" (UniqueName: \"kubernetes.io/projected/dea1ae69-0c15-4228-a323-dc6f762e3c82-kube-api-access-fjqxq\") pod \"designate-operator-controller-manager-6d9697b7f4-bf56z\" (UID: \"dea1ae69-0c15-4228-a323-dc6f762e3c82\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.192517 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjd2d\" (UniqueName: \"kubernetes.io/projected/eb76dd84-30db-4769-852c-9a42814949d7-kube-api-access-xjd2d\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-mttxk\" (UID: \"eb76dd84-30db-4769-852c-9a42814949d7\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.192549 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6s9\" (UniqueName: \"kubernetes.io/projected/80b25db7-e1c2-4787-89f4-952cd7e845ba-kube-api-access-ff6s9\") pod \"cinder-operator-controller-manager-8d874c8fc-4wv6z\" (UID: \"80b25db7-e1c2-4787-89f4-952cd7e845ba\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.192611 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n968v\" (UniqueName: \"kubernetes.io/projected/2773429e-ccbb-43a4-a88a-a1cd41a63e10-kube-api-access-n968v\") pod \"glance-operator-controller-manager-8886f4c47-nsn26\" (UID: \"2773429e-ccbb-43a4-a88a-a1cd41a63e10\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.192952 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.196893 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jfhnm"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.200291 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.236238 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.262394 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.275631 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.283828 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xggwq"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.285085 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.285845 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.287214 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-knbw2"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.290986 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293353 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllrk\" (UniqueName: \"kubernetes.io/projected/ce7feb31-22f3-42d9-83b1-cd9155abae99-kube-api-access-zllrk\") pod \"horizon-operator-controller-manager-5fb775575f-l5dv2\" (UID: \"ce7feb31-22f3-42d9-83b1-cd9155abae99\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293392 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n968v\" (UniqueName: \"kubernetes.io/projected/2773429e-ccbb-43a4-a88a-a1cd41a63e10-kube-api-access-n968v\") pod \"glance-operator-controller-manager-8886f4c47-nsn26\" (UID: \"2773429e-ccbb-43a4-a88a-a1cd41a63e10\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293432 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6q9\" (UniqueName: \"kubernetes.io/projected/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-kube-api-access-7s6q9\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293459 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqxq\" (UniqueName: \"kubernetes.io/projected/dea1ae69-0c15-4228-a323-dc6f762e3c82-kube-api-access-fjqxq\") pod \"designate-operator-controller-manager-6d9697b7f4-bf56z\" (UID: \"dea1ae69-0c15-4228-a323-dc6f762e3c82\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293478 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qq8\" (UniqueName: \"kubernetes.io/projected/33b18ace-2da3-4bad-b093-d7db2aad7f50-kube-api-access-j2qq8\") pod \"ironic-operator-controller-manager-5f4b8bd54d-v9fgj\" (UID: \"33b18ace-2da3-4bad-b093-d7db2aad7f50\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293499 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjd2d\" (UniqueName: \"kubernetes.io/projected/eb76dd84-30db-4769-852c-9a42814949d7-kube-api-access-xjd2d\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-mttxk\" (UID: \"eb76dd84-30db-4769-852c-9a42814949d7\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293520 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293542 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6s9\" (UniqueName: \"kubernetes.io/projected/80b25db7-e1c2-4787-89f4-952cd7e845ba-kube-api-access-ff6s9\") pod \"cinder-operator-controller-manager-8d874c8fc-4wv6z\" (UID: \"80b25db7-e1c2-4787-89f4-952cd7e845ba\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293558 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhxdn\" (UniqueName: \"kubernetes.io/projected/cc5025a4-0807-478d-831a-c6ed424628a9-kube-api-access-qhxdn\") pod \"keystone-operator-controller-manager-84f48565d4-ddtbw\" (UID: \"cc5025a4-0807-478d-831a-c6ed424628a9\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.293583 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldhpd\" (UniqueName: \"kubernetes.io/projected/d806e5bf-8346-46c0-a3de-5f8412e92b4f-kube-api-access-ldhpd\") pod \"heat-operator-controller-manager-69d6db494d-lmgq2\" (UID: \"d806e5bf-8346-46c0-a3de-5f8412e92b4f\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.314567 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn"]
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.317239 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6s9\" (UniqueName: \"kubernetes.io/projected/80b25db7-e1c2-4787-89f4-952cd7e845ba-kube-api-access-ff6s9\") pod \"cinder-operator-controller-manager-8d874c8fc-4wv6z\" (UID: \"80b25db7-e1c2-4787-89f4-952cd7e845ba\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"
Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.317939 4931 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"kube-api-access-fjqxq\" (UniqueName: \"kubernetes.io/projected/dea1ae69-0c15-4228-a323-dc6f762e3c82-kube-api-access-fjqxq\") pod \"designate-operator-controller-manager-6d9697b7f4-bf56z\" (UID: \"dea1ae69-0c15-4228-a323-dc6f762e3c82\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.320068 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.321444 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.322210 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjd2d\" (UniqueName: \"kubernetes.io/projected/eb76dd84-30db-4769-852c-9a42814949d7-kube-api-access-xjd2d\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-mttxk\" (UID: \"eb76dd84-30db-4769-852c-9a42814949d7\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.323313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2zrxv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.323650 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n968v\" (UniqueName: \"kubernetes.io/projected/2773429e-ccbb-43a4-a88a-a1cd41a63e10-kube-api-access-n968v\") pod \"glance-operator-controller-manager-8886f4c47-nsn26\" (UID: \"2773429e-ccbb-43a4-a88a-a1cd41a63e10\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.330410 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.331155 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.333808 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hdxvm" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.346498 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.347588 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.350907 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-fn7xp" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.351049 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.355157 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.356154 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.361269 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.369384 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.375784 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.376573 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.377954 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.387146 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.388713 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.390536 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.390773 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ppcml" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.390935 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zl4cx" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.391333 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.393908 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.394549 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.395193 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhxdn\" (UniqueName: \"kubernetes.io/projected/cc5025a4-0807-478d-831a-c6ed424628a9-kube-api-access-qhxdn\") pod \"keystone-operator-controller-manager-84f48565d4-ddtbw\" (UID: \"cc5025a4-0807-478d-831a-c6ed424628a9\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.395225 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhpd\" (UniqueName: \"kubernetes.io/projected/d806e5bf-8346-46c0-a3de-5f8412e92b4f-kube-api-access-ldhpd\") pod \"heat-operator-controller-manager-69d6db494d-lmgq2\" (UID: \"d806e5bf-8346-46c0-a3de-5f8412e92b4f\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.395273 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rhb6\" (UniqueName: \"kubernetes.io/projected/8553945b-dfe3-4c77-bb73-dce58c6ad3ba-kube-api-access-7rhb6\") pod \"mariadb-operator-controller-manager-67bf948998-fsdvn\" (UID: \"8553945b-dfe3-4c77-bb73-dce58c6ad3ba\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.395294 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllrk\" (UniqueName: \"kubernetes.io/projected/ce7feb31-22f3-42d9-83b1-cd9155abae99-kube-api-access-zllrk\") pod \"horizon-operator-controller-manager-5fb775575f-l5dv2\" (UID: \"ce7feb31-22f3-42d9-83b1-cd9155abae99\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2" Jan 30 05:23:18 crc 
kubenswrapper[4931]: I0130 05:23:18.395320 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/a3f6ed4d-518f-4415-9378-73fca072d431-kube-api-access-xchc4\") pod \"manila-operator-controller-manager-7dd968899f-5sgtg\" (UID: \"a3f6ed4d-518f-4415-9378-73fca072d431\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.395338 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6q9\" (UniqueName: \"kubernetes.io/projected/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-kube-api-access-7s6q9\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.395362 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2qq8\" (UniqueName: \"kubernetes.io/projected/33b18ace-2da3-4bad-b093-d7db2aad7f50-kube-api-access-j2qq8\") pod \"ironic-operator-controller-manager-5f4b8bd54d-v9fgj\" (UID: \"33b18ace-2da3-4bad-b093-d7db2aad7f50\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.395385 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.395481 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" 
not found Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.395517 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert podName:29ae7a52-ff32-4f97-8f6c-830ac4e4b40b nodeName:}" failed. No retries permitted until 2026-01-30 05:23:18.895503377 +0000 UTC m=+934.265413634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert") pod "infra-operator-controller-manager-79955696d6-tzxqv" (UID: "29ae7a52-ff32-4f97-8f6c-830ac4e4b40b") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.396410 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tx9mf" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.396698 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.407526 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.409165 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.416992 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldhpd\" (UniqueName: \"kubernetes.io/projected/d806e5bf-8346-46c0-a3de-5f8412e92b4f-kube-api-access-ldhpd\") pod \"heat-operator-controller-manager-69d6db494d-lmgq2\" (UID: \"d806e5bf-8346-46c0-a3de-5f8412e92b4f\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.422855 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6q9\" (UniqueName: \"kubernetes.io/projected/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-kube-api-access-7s6q9\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.422862 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.427695 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllrk\" (UniqueName: \"kubernetes.io/projected/ce7feb31-22f3-42d9-83b1-cd9155abae99-kube-api-access-zllrk\") pod \"horizon-operator-controller-manager-5fb775575f-l5dv2\" (UID: \"ce7feb31-22f3-42d9-83b1-cd9155abae99\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.429970 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhxdn\" (UniqueName: \"kubernetes.io/projected/cc5025a4-0807-478d-831a-c6ed424628a9-kube-api-access-qhxdn\") pod \"keystone-operator-controller-manager-84f48565d4-ddtbw\" (UID: \"cc5025a4-0807-478d-831a-c6ed424628a9\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.431178 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.432027 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.434859 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-p2zpj" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.437347 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2qq8\" (UniqueName: \"kubernetes.io/projected/33b18ace-2da3-4bad-b093-d7db2aad7f50-kube-api-access-j2qq8\") pod \"ironic-operator-controller-manager-5f4b8bd54d-v9fgj\" (UID: \"33b18ace-2da3-4bad-b093-d7db2aad7f50\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.438865 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.452551 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.482842 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.484795 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.486616 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vk7pd" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496222 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8xbx\" (UniqueName: \"kubernetes.io/projected/2b83a9b3-5579-438f-8f65-effa382b726c-kube-api-access-c8xbx\") pod \"nova-operator-controller-manager-55bff696bd-5l9jv\" (UID: \"2b83a9b3-5579-438f-8f65-effa382b726c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496270 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncsq\" (UniqueName: \"kubernetes.io/projected/59634caa-7fe0-49a1-98bf-dbc61a15f495-kube-api-access-nncsq\") pod \"placement-operator-controller-manager-5b964cf4cd-t4scx\" (UID: \"59634caa-7fe0-49a1-98bf-dbc61a15f495\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496337 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcj4w\" (UniqueName: \"kubernetes.io/projected/a536697c-8056-4907-a09e-b23aa129435d-kube-api-access-hcj4w\") pod \"ovn-operator-controller-manager-788c46999f-mkk7j\" (UID: \"a536697c-8056-4907-a09e-b23aa129435d\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496357 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") 
pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496386 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5zmp\" (UniqueName: \"kubernetes.io/projected/5e6de10d-baf2-4ef4-9acf-d093ee65c4fd-kube-api-access-f5zmp\") pod \"neutron-operator-controller-manager-585dbc889-wssqz\" (UID: \"5e6de10d-baf2-4ef4-9acf-d093ee65c4fd\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496409 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nt5x\" (UniqueName: \"kubernetes.io/projected/47b128c8-46ef-422c-aabc-1220f85fef83-kube-api-access-7nt5x\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496467 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rhb6\" (UniqueName: \"kubernetes.io/projected/8553945b-dfe3-4c77-bb73-dce58c6ad3ba-kube-api-access-7rhb6\") pod \"mariadb-operator-controller-manager-67bf948998-fsdvn\" (UID: \"8553945b-dfe3-4c77-bb73-dce58c6ad3ba\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496495 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hqdv\" (UniqueName: \"kubernetes.io/projected/456074da-531d-471b-92d3-cb4ea156bfae-kube-api-access-2hqdv\") pod 
\"octavia-operator-controller-manager-6687f8d877-kndp7\" (UID: \"456074da-531d-471b-92d3-cb4ea156bfae\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496515 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/a3f6ed4d-518f-4415-9378-73fca072d431-kube-api-access-xchc4\") pod \"manila-operator-controller-manager-7dd968899f-5sgtg\" (UID: \"a3f6ed4d-518f-4415-9378-73fca072d431\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.496943 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.517030 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rhb6\" (UniqueName: \"kubernetes.io/projected/8553945b-dfe3-4c77-bb73-dce58c6ad3ba-kube-api-access-7rhb6\") pod \"mariadb-operator-controller-manager-67bf948998-fsdvn\" (UID: \"8553945b-dfe3-4c77-bb73-dce58c6ad3ba\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.522414 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/a3f6ed4d-518f-4415-9378-73fca072d431-kube-api-access-xchc4\") pod \"manila-operator-controller-manager-7dd968899f-5sgtg\" (UID: \"a3f6ed4d-518f-4415-9378-73fca072d431\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.525529 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.537028 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.562610 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.563313 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.568805 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rr2qh" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.575612 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.594085 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.600642 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hqdv\" (UniqueName: \"kubernetes.io/projected/456074da-531d-471b-92d3-cb4ea156bfae-kube-api-access-2hqdv\") pod \"octavia-operator-controller-manager-6687f8d877-kndp7\" (UID: \"456074da-531d-471b-92d3-cb4ea156bfae\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.600711 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8xbx\" (UniqueName: \"kubernetes.io/projected/2b83a9b3-5579-438f-8f65-effa382b726c-kube-api-access-c8xbx\") pod \"nova-operator-controller-manager-55bff696bd-5l9jv\" (UID: \"2b83a9b3-5579-438f-8f65-effa382b726c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.600768 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncsq\" (UniqueName: \"kubernetes.io/projected/59634caa-7fe0-49a1-98bf-dbc61a15f495-kube-api-access-nncsq\") pod \"placement-operator-controller-manager-5b964cf4cd-t4scx\" (UID: \"59634caa-7fe0-49a1-98bf-dbc61a15f495\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.600871 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvx2b\" (UniqueName: \"kubernetes.io/projected/3d63764e-5f26-4a63-870a-af0e86eb5d23-kube-api-access-wvx2b\") pod \"swift-operator-controller-manager-68fc8c869-gqvgs\" (UID: \"3d63764e-5f26-4a63-870a-af0e86eb5d23\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" Jan 30 05:23:18 crc kubenswrapper[4931]: 
I0130 05:23:18.600925 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcj4w\" (UniqueName: \"kubernetes.io/projected/a536697c-8056-4907-a09e-b23aa129435d-kube-api-access-hcj4w\") pod \"ovn-operator-controller-manager-788c46999f-mkk7j\" (UID: \"a536697c-8056-4907-a09e-b23aa129435d\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.600962 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.601055 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5zmp\" (UniqueName: \"kubernetes.io/projected/5e6de10d-baf2-4ef4-9acf-d093ee65c4fd-kube-api-access-f5zmp\") pod \"neutron-operator-controller-manager-585dbc889-wssqz\" (UID: \"5e6de10d-baf2-4ef4-9acf-d093ee65c4fd\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.601200 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ntrl\" (UniqueName: \"kubernetes.io/projected/8e470db6-3785-4da2-9b83-5242d6712d6a-kube-api-access-5ntrl\") pod \"telemetry-operator-controller-manager-64b5b76f97-gqv2m\" (UID: \"8e470db6-3785-4da2-9b83-5242d6712d6a\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.601257 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nt5x\" (UniqueName: 
\"kubernetes.io/projected/47b128c8-46ef-422c-aabc-1220f85fef83-kube-api-access-7nt5x\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.602533 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.602602 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert podName:47b128c8-46ef-422c-aabc-1220f85fef83 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:19.102576952 +0000 UTC m=+934.472487199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" (UID: "47b128c8-46ef-422c-aabc-1220f85fef83") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.606541 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.630517 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcj4w\" (UniqueName: \"kubernetes.io/projected/a536697c-8056-4907-a09e-b23aa129435d-kube-api-access-hcj4w\") pod \"ovn-operator-controller-manager-788c46999f-mkk7j\" (UID: \"a536697c-8056-4907-a09e-b23aa129435d\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.631123 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hqdv\" (UniqueName: \"kubernetes.io/projected/456074da-531d-471b-92d3-cb4ea156bfae-kube-api-access-2hqdv\") pod \"octavia-operator-controller-manager-6687f8d877-kndp7\" (UID: \"456074da-531d-471b-92d3-cb4ea156bfae\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.636846 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5zmp\" (UniqueName: \"kubernetes.io/projected/5e6de10d-baf2-4ef4-9acf-d093ee65c4fd-kube-api-access-f5zmp\") pod \"neutron-operator-controller-manager-585dbc889-wssqz\" (UID: \"5e6de10d-baf2-4ef4-9acf-d093ee65c4fd\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.637603 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8xbx\" (UniqueName: \"kubernetes.io/projected/2b83a9b3-5579-438f-8f65-effa382b726c-kube-api-access-c8xbx\") pod \"nova-operator-controller-manager-55bff696bd-5l9jv\" (UID: \"2b83a9b3-5579-438f-8f65-effa382b726c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.643993 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nncsq\" (UniqueName: \"kubernetes.io/projected/59634caa-7fe0-49a1-98bf-dbc61a15f495-kube-api-access-nncsq\") pod \"placement-operator-controller-manager-5b964cf4cd-t4scx\" (UID: \"59634caa-7fe0-49a1-98bf-dbc61a15f495\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.644899 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-vqp2s"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.646464 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.649236 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-vqp2s"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.650780 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-g7tvd" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.657320 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nt5x\" (UniqueName: \"kubernetes.io/projected/47b128c8-46ef-422c-aabc-1220f85fef83-kube-api-access-7nt5x\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.663394 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.706137 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvx2b\" (UniqueName: \"kubernetes.io/projected/3d63764e-5f26-4a63-870a-af0e86eb5d23-kube-api-access-wvx2b\") pod \"swift-operator-controller-manager-68fc8c869-gqvgs\" (UID: \"3d63764e-5f26-4a63-870a-af0e86eb5d23\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.706206 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ntrl\" (UniqueName: \"kubernetes.io/projected/8e470db6-3785-4da2-9b83-5242d6712d6a-kube-api-access-5ntrl\") pod \"telemetry-operator-controller-manager-64b5b76f97-gqv2m\" (UID: \"8e470db6-3785-4da2-9b83-5242d6712d6a\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.706274 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9h8k\" (UniqueName: \"kubernetes.io/projected/9e5eb1e9-111a-4230-92d6-5b1fbc332ada-kube-api-access-k9h8k\") pod \"test-operator-controller-manager-56f8bfcd9f-vccxr\" (UID: \"9e5eb1e9-111a-4230-92d6-5b1fbc332ada\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.724588 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvx2b\" (UniqueName: \"kubernetes.io/projected/3d63764e-5f26-4a63-870a-af0e86eb5d23-kube-api-access-wvx2b\") pod \"swift-operator-controller-manager-68fc8c869-gqvgs\" (UID: \"3d63764e-5f26-4a63-870a-af0e86eb5d23\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 
05:23:18.728804 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.729686 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ntrl\" (UniqueName: \"kubernetes.io/projected/8e470db6-3785-4da2-9b83-5242d6712d6a-kube-api-access-5ntrl\") pod \"telemetry-operator-controller-manager-64b5b76f97-gqv2m\" (UID: \"8e470db6-3785-4da2-9b83-5242d6712d6a\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.745156 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.764894 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.765698 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.776273 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.776352 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.776898 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sxl8b" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.783024 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.786816 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.799889 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.807123 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2f7b\" (UniqueName: \"kubernetes.io/projected/6d92f2e0-367c-428a-bcd5-cf6e5846046f-kube-api-access-h2f7b\") pod \"watcher-operator-controller-manager-564965969-vqp2s\" (UID: \"6d92f2e0-367c-428a-bcd5-cf6e5846046f\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.807299 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9h8k\" (UniqueName: \"kubernetes.io/projected/9e5eb1e9-111a-4230-92d6-5b1fbc332ada-kube-api-access-k9h8k\") pod \"test-operator-controller-manager-56f8bfcd9f-vccxr\" (UID: \"9e5eb1e9-111a-4230-92d6-5b1fbc332ada\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.812662 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.829498 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9h8k\" (UniqueName: \"kubernetes.io/projected/9e5eb1e9-111a-4230-92d6-5b1fbc332ada-kube-api-access-k9h8k\") pod \"test-operator-controller-manager-56f8bfcd9f-vccxr\" (UID: \"9e5eb1e9-111a-4230-92d6-5b1fbc332ada\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.831830 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.853440 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.854817 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.863802 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-s952l" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.868953 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.893342 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.899361 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"] Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.909470 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.909511 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.909572 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2f7b\" (UniqueName: \"kubernetes.io/projected/6d92f2e0-367c-428a-bcd5-cf6e5846046f-kube-api-access-h2f7b\") pod \"watcher-operator-controller-manager-564965969-vqp2s\" (UID: \"6d92f2e0-367c-428a-bcd5-cf6e5846046f\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.909622 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: 
\"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.909640 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsnh2\" (UniqueName: \"kubernetes.io/projected/5852e12a-376e-420f-a0fd-efecae7ef623-kube-api-access-qsnh2\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.909761 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:18 crc kubenswrapper[4931]: E0130 05:23:18.909805 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert podName:29ae7a52-ff32-4f97-8f6c-830ac4e4b40b nodeName:}" failed. No retries permitted until 2026-01-30 05:23:19.909790859 +0000 UTC m=+935.279701116 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert") pod "infra-operator-controller-manager-79955696d6-tzxqv" (UID: "29ae7a52-ff32-4f97-8f6c-830ac4e4b40b") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:18 crc kubenswrapper[4931]: I0130 05:23:18.939459 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2f7b\" (UniqueName: \"kubernetes.io/projected/6d92f2e0-367c-428a-bcd5-cf6e5846046f-kube-api-access-h2f7b\") pod \"watcher-operator-controller-manager-564965969-vqp2s\" (UID: \"6d92f2e0-367c-428a-bcd5-cf6e5846046f\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:18.999857 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:18.999873 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.010618 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.010651 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsnh2\" (UniqueName: \"kubernetes.io/projected/5852e12a-376e-420f-a0fd-efecae7ef623-kube-api-access-qsnh2\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " 
pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.010686 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.010746 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvj6\" (UniqueName: \"kubernetes.io/projected/ad890bc5-5b72-4833-86d5-2c022cd87e4a-kube-api-access-clvj6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-v4vnz\" (UID: \"ad890bc5-5b72-4833-86d5-2c022cd87e4a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.010875 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.010921 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:19.51090488 +0000 UTC m=+934.880815137 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "metrics-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.011151 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.011209 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:19.511191588 +0000 UTC m=+934.881101835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "webhook-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.030858 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.033807 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsnh2\" (UniqueName: \"kubernetes.io/projected/5852e12a-376e-420f-a0fd-efecae7ef623-kube-api-access-qsnh2\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.113248 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvj6\" (UniqueName: 
\"kubernetes.io/projected/ad890bc5-5b72-4833-86d5-2c022cd87e4a-kube-api-access-clvj6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-v4vnz\" (UID: \"ad890bc5-5b72-4833-86d5-2c022cd87e4a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.113411 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.113731 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.113825 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert podName:47b128c8-46ef-422c-aabc-1220f85fef83 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:20.113762398 +0000 UTC m=+935.483672655 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" (UID: "47b128c8-46ef-422c-aabc-1220f85fef83") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.133119 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.141667 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvj6\" (UniqueName: \"kubernetes.io/projected/ad890bc5-5b72-4833-86d5-2c022cd87e4a-kube-api-access-clvj6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-v4vnz\" (UID: \"ad890bc5-5b72-4833-86d5-2c022cd87e4a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.145458 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.209987 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.335668 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.339930 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.370852 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2"] Jan 30 05:23:19 crc kubenswrapper[4931]: W0130 05:23:19.374916 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd806e5bf_8346_46c0_a3de_5f8412e92b4f.slice/crio-79c2b361d544854d19894ef245ced794b18dd59b97294328a5498b546fff9d74 WatchSource:0}: Error finding container 79c2b361d544854d19894ef245ced794b18dd59b97294328a5498b546fff9d74: Status 404 returned error can't find the container with id 79c2b361d544854d19894ef245ced794b18dd59b97294328a5498b546fff9d74 Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.533119 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.533172 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: 
\"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.533281 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.533311 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.533336 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:20.533323796 +0000 UTC m=+935.903234053 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "webhook-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.533383 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:20.533364997 +0000 UTC m=+935.903275244 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "metrics-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.568383 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw" event={"ID":"cc5025a4-0807-478d-831a-c6ed424628a9","Type":"ContainerStarted","Data":"06b2923c961bf0d4ee316001972de932f77d28df5f63fb55ba281f4acbd3edd5"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.570260 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2" event={"ID":"ce7feb31-22f3-42d9-83b1-cd9155abae99","Type":"ContainerStarted","Data":"699d1979a4e6c22e40e2958d0727b8a813afa0c8c42fb39279acb007eef3702b"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.571841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2" event={"ID":"d806e5bf-8346-46c0-a3de-5f8412e92b4f","Type":"ContainerStarted","Data":"79c2b361d544854d19894ef245ced794b18dd59b97294328a5498b546fff9d74"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.573078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" event={"ID":"33b18ace-2da3-4bad-b093-d7db2aad7f50","Type":"ContainerStarted","Data":"516b01dca3eca9c8675ba023cbc5f5231817840726a7660b277587917f25fdc3"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.574124 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk" 
event={"ID":"eb76dd84-30db-4769-852c-9a42814949d7","Type":"ContainerStarted","Data":"14f29a0f74f382c91558a5ade188b393f9bed0c831bc5c494f955e657bf8d4ea"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.576190 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26" event={"ID":"2773429e-ccbb-43a4-a88a-a1cd41a63e10","Type":"ContainerStarted","Data":"07d7a793d68db451ddcc1bf5a4730ad681e0ad391b3fa174b65490081056f1d5"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.577546 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" event={"ID":"dea1ae69-0c15-4228-a323-dc6f762e3c82","Type":"ContainerStarted","Data":"f1884e6ab103cea1bba8d73c115d3edba1f8b0dbeeae34f1637b34025862f1ee"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.578311 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z" event={"ID":"80b25db7-e1c2-4787-89f4-952cd7e845ba","Type":"ContainerStarted","Data":"9eb6821fa91ed10a1bd3b567ceeda4f00173041848682e4c89c5ffbbd889e058"} Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.736933 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.744545 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.749785 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.756689 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz"] Jan 30 
05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.777899 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.785914 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.802979 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv"] Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.810872 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcj4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-mkk7j_openstack-operators(a536697c-8056-4907-a09e-b23aa129435d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.811301 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h2f7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-vqp2s_openstack-operators(6d92f2e0-367c-428a-bcd5-cf6e5846046f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.812391 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" podUID="a536697c-8056-4907-a09e-b23aa129435d" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.812399 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" podUID="6d92f2e0-367c-428a-bcd5-cf6e5846046f" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.829519 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-vqp2s"] Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.833237 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-clvj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-v4vnz_openstack-operators(ad890bc5-5b72-4833-86d5-2c022cd87e4a): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.834908 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" podUID="ad890bc5-5b72-4833-86d5-2c022cd87e4a" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.834985 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz"] Jan 30 05:23:19 crc kubenswrapper[4931]: W0130 05:23:19.840540 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d63764e_5f26_4a63_870a_af0e86eb5d23.slice/crio-1e3ea5b7e3a1c069f4d22ce87c1264356e67a543d3ab638dbbe5e61a5c08e1d1 WatchSource:0}: Error finding container 1e3ea5b7e3a1c069f4d22ce87c1264356e67a543d3ab638dbbe5e61a5c08e1d1: Status 404 returned error can't find the container with id 1e3ea5b7e3a1c069f4d22ce87c1264356e67a543d3ab638dbbe5e61a5c08e1d1 Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.840689 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 
10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2hqdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-kndp7_openstack-operators(456074da-531d-471b-92d3-cb4ea156bfae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.841791 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" 
podUID="456074da-531d-471b-92d3-cb4ea156bfae" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.841963 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wvx2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-gqvgs_openstack-operators(3d63764e-5f26-4a63-870a-af0e86eb5d23): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 05:23:19 crc kubenswrapper[4931]: W0130 05:23:19.842311 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e5eb1e9_111a_4230_92d6_5b1fbc332ada.slice/crio-6daadfb949bcd6080310d50bcb3aeaa7d05d66803d1bb9a2b60ca3c9b04244c3 WatchSource:0}: Error finding container 6daadfb949bcd6080310d50bcb3aeaa7d05d66803d1bb9a2b60ca3c9b04244c3: Status 404 returned error can't find the container with id 6daadfb949bcd6080310d50bcb3aeaa7d05d66803d1bb9a2b60ca3c9b04244c3 Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.842744 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr"] Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.843414 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" 
podUID="3d63764e-5f26-4a63-870a-af0e86eb5d23" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.844798 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k9h8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-vccxr_openstack-operators(9e5eb1e9-111a-4230-92d6-5b1fbc332ada): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.846050 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" podUID="9e5eb1e9-111a-4230-92d6-5b1fbc332ada" Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.849586 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.855055 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs"] Jan 30 05:23:19 crc kubenswrapper[4931]: I0130 05:23:19.945183 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: 
\"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.945372 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:19 crc kubenswrapper[4931]: E0130 05:23:19.945478 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert podName:29ae7a52-ff32-4f97-8f6c-830ac4e4b40b nodeName:}" failed. No retries permitted until 2026-01-30 05:23:21.945452199 +0000 UTC m=+937.315362516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert") pod "infra-operator-controller-manager-79955696d6-tzxqv" (UID: "29ae7a52-ff32-4f97-8f6c-830ac4e4b40b") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.146409 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.146575 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.146628 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert podName:47b128c8-46ef-422c-aabc-1220f85fef83 nodeName:}" failed. 
No retries permitted until 2026-01-30 05:23:22.1466076 +0000 UTC m=+937.516517857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" (UID: "47b128c8-46ef-422c-aabc-1220f85fef83") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.552248 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.552302 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.552443 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.552483 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.552506 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. 
No retries permitted until 2026-01-30 05:23:22.552492341 +0000 UTC m=+937.922402598 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "webhook-server-cert" not found Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.552592 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:22.552567963 +0000 UTC m=+937.922478320 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "metrics-server-cert" not found Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.605708 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" event={"ID":"5e6de10d-baf2-4ef4-9acf-d093ee65c4fd","Type":"ContainerStarted","Data":"06ac2eca3df17fbdc2e9847bff81089b0dfd6b12a02f6a6fbd704819801c9c82"} Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.608047 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" event={"ID":"3d63764e-5f26-4a63-870a-af0e86eb5d23","Type":"ContainerStarted","Data":"1e3ea5b7e3a1c069f4d22ce87c1264356e67a543d3ab638dbbe5e61a5c08e1d1"} Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.611261 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" 
event={"ID":"59634caa-7fe0-49a1-98bf-dbc61a15f495","Type":"ContainerStarted","Data":"a85556acf5fd5c5bb24304c0807640e47defb000e0b39b263c48f504303b1dc1"} Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.612107 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" podUID="3d63764e-5f26-4a63-870a-af0e86eb5d23" Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.612811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" event={"ID":"9e5eb1e9-111a-4230-92d6-5b1fbc332ada","Type":"ContainerStarted","Data":"6daadfb949bcd6080310d50bcb3aeaa7d05d66803d1bb9a2b60ca3c9b04244c3"} Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.614023 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" podUID="9e5eb1e9-111a-4230-92d6-5b1fbc332ada" Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.614937 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" event={"ID":"ad890bc5-5b72-4833-86d5-2c022cd87e4a","Type":"ContainerStarted","Data":"ff84e7ded79401d6cc907dd06658e56ce815b51b4f7e0f258d0c089befc32f1f"} Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.617302 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" podUID="ad890bc5-5b72-4833-86d5-2c022cd87e4a" Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.628728 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" event={"ID":"8e470db6-3785-4da2-9b83-5242d6712d6a","Type":"ContainerStarted","Data":"4d1529c3cd06cef7c859cdbded1996e56a11e05d299b8ac71ceae26922cf7c8c"} Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.632396 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" event={"ID":"6d92f2e0-367c-428a-bcd5-cf6e5846046f","Type":"ContainerStarted","Data":"589ce9e6eb6354776adbacc8ea36e2442305cc2b39239273b6f66d743190e980"} Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.633606 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" podUID="6d92f2e0-367c-428a-bcd5-cf6e5846046f" Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.636211 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" event={"ID":"a3f6ed4d-518f-4415-9378-73fca072d431","Type":"ContainerStarted","Data":"87382b6a8b2e9985466b0f3911466583658fa03b4a291a32bd38a3b33905df61"} Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.637616 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" 
event={"ID":"456074da-531d-471b-92d3-cb4ea156bfae","Type":"ContainerStarted","Data":"4d829eec5ce9b3580ee3f4dac0b7b12a402cbdd7aae399082b6f52677c6c296b"} Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.639509 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" podUID="456074da-531d-471b-92d3-cb4ea156bfae" Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.640038 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" event={"ID":"a536697c-8056-4907-a09e-b23aa129435d","Type":"ContainerStarted","Data":"9dd8b5ea4a528dbe0f4ef4b500a30d4f620becea5d8059f124470198e50517bf"} Jan 30 05:23:20 crc kubenswrapper[4931]: E0130 05:23:20.644267 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" podUID="a536697c-8056-4907-a09e-b23aa129435d" Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.645956 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" event={"ID":"8553945b-dfe3-4c77-bb73-dce58c6ad3ba","Type":"ContainerStarted","Data":"31e5cb4b3d4087f00cd1b494b56b823920b79283e25c58e355b3d380020832c3"} Jan 30 05:23:20 crc kubenswrapper[4931]: I0130 05:23:20.648088 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" 
event={"ID":"2b83a9b3-5579-438f-8f65-effa382b726c","Type":"ContainerStarted","Data":"db4b8f222693a8b5dce73a0b7bfb48f1466a4469dcc9166f10b46355eec50f5c"} Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.657110 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" podUID="6d92f2e0-367c-428a-bcd5-cf6e5846046f" Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.658548 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" podUID="456074da-531d-471b-92d3-cb4ea156bfae" Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.658789 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" podUID="ad890bc5-5b72-4833-86d5-2c022cd87e4a" Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.658807 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" podUID="9e5eb1e9-111a-4230-92d6-5b1fbc332ada"
Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.658830 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" podUID="a536697c-8056-4907-a09e-b23aa129435d"
Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.659049 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" podUID="3d63764e-5f26-4a63-870a-af0e86eb5d23"
Jan 30 05:23:21 crc kubenswrapper[4931]: I0130 05:23:21.967152 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.967282 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:21 crc kubenswrapper[4931]: E0130 05:23:21.967359 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert podName:29ae7a52-ff32-4f97-8f6c-830ac4e4b40b nodeName:}" failed. No retries permitted until 2026-01-30 05:23:25.967332207 +0000 UTC m=+941.337242464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert") pod "infra-operator-controller-manager-79955696d6-tzxqv" (UID: "29ae7a52-ff32-4f97-8f6c-830ac4e4b40b") : secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:22 crc kubenswrapper[4931]: I0130 05:23:22.169575 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"
Jan 30 05:23:22 crc kubenswrapper[4931]: E0130 05:23:22.169701 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:22 crc kubenswrapper[4931]: E0130 05:23:22.169765 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert podName:47b128c8-46ef-422c-aabc-1220f85fef83 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:26.169748683 +0000 UTC m=+941.539658930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" (UID: "47b128c8-46ef-422c-aabc-1220f85fef83") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:22 crc kubenswrapper[4931]: I0130 05:23:22.576949 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"
Jan 30 05:23:22 crc kubenswrapper[4931]: I0130 05:23:22.577039 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"
Jan 30 05:23:22 crc kubenswrapper[4931]: E0130 05:23:22.577107 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 30 05:23:22 crc kubenswrapper[4931]: E0130 05:23:22.577162 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:26.577146126 +0000 UTC m=+941.947056383 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "metrics-server-cert" not found
Jan 30 05:23:22 crc kubenswrapper[4931]: E0130 05:23:22.577480 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 30 05:23:22 crc kubenswrapper[4931]: E0130 05:23:22.577551 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:26.577529206 +0000 UTC m=+941.947439463 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "webhook-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: I0130 05:23:26.025343 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.025529 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.025943 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert podName:29ae7a52-ff32-4f97-8f6c-830ac4e4b40b nodeName:}" failed. No retries permitted until 2026-01-30 05:23:34.025927162 +0000 UTC m=+949.395837409 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert") pod "infra-operator-controller-manager-79955696d6-tzxqv" (UID: "29ae7a52-ff32-4f97-8f6c-830ac4e4b40b") : secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: I0130 05:23:26.228226 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.228489 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.228624 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert podName:47b128c8-46ef-422c-aabc-1220f85fef83 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:34.228579535 +0000 UTC m=+949.598489802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" (UID: "47b128c8-46ef-422c-aabc-1220f85fef83") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: I0130 05:23:26.634329 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"
Jan 30 05:23:26 crc kubenswrapper[4931]: I0130 05:23:26.634405 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.634557 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.634557 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.634618 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:34.63460181 +0000 UTC m=+950.004512077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "metrics-server-cert" not found
Jan 30 05:23:26 crc kubenswrapper[4931]: E0130 05:23:26.634635 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:34.634627851 +0000 UTC m=+950.004538118 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "webhook-server-cert" not found
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.363303 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.363361 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.364261 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.364751 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60aa2b4543ca7f8bb21bef3c167f0da099829d6ddc544f185e09f8c4de74ad75"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.364810 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://60aa2b4543ca7f8bb21bef3c167f0da099829d6ddc544f185e09f8c4de74ad75" gracePeriod=600
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.721596 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="60aa2b4543ca7f8bb21bef3c167f0da099829d6ddc544f185e09f8c4de74ad75" exitCode=0
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.721637 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"60aa2b4543ca7f8bb21bef3c167f0da099829d6ddc544f185e09f8c4de74ad75"}
Jan 30 05:23:27 crc kubenswrapper[4931]: I0130 05:23:27.721667 4931 scope.go:117] "RemoveContainer" containerID="1794ca6ffdd404c39dffe9fa048526a3a78869de00d876a52d3bd280c8bbc2a2"
Jan 30 05:23:33 crc kubenswrapper[4931]: E0130 05:23:33.153046 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521"
Jan 30 05:23:33 crc kubenswrapper[4931]: E0130 05:23:33.153956 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2qq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5f4b8bd54d-v9fgj_openstack-operators(33b18ace-2da3-4bad-b093-d7db2aad7f50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 30 05:23:33 crc kubenswrapper[4931]: E0130 05:23:33.155243 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" podUID="33b18ace-2da3-4bad-b093-d7db2aad7f50"
Jan 30 05:23:33 crc kubenswrapper[4931]: E0130 05:23:33.782655 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" podUID="33b18ace-2da3-4bad-b093-d7db2aad7f50"
Jan 30 05:23:34 crc kubenswrapper[4931]: I0130 05:23:34.040931 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.041100 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.041169 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert podName:29ae7a52-ff32-4f97-8f6c-830ac4e4b40b nodeName:}" failed. No retries permitted until 2026-01-30 05:23:50.04115028 +0000 UTC m=+965.411060537 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert") pod "infra-operator-controller-manager-79955696d6-tzxqv" (UID: "29ae7a52-ff32-4f97-8f6c-830ac4e4b40b") : secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.082724 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.082885 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fjqxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d9697b7f4-bf56z_openstack-operators(dea1ae69-0c15-4228-a323-dc6f762e3c82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.084485 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" podUID="dea1ae69-0c15-4228-a323-dc6f762e3c82"
Jan 30 05:23:34 crc kubenswrapper[4931]: I0130 05:23:34.244293 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.244508 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.244562 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert podName:47b128c8-46ef-422c-aabc-1220f85fef83 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:50.244544233 +0000 UTC m=+965.614454500 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" (UID: "47b128c8-46ef-422c-aabc-1220f85fef83") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: I0130 05:23:34.650175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"
Jan 30 05:23:34 crc kubenswrapper[4931]: I0130 05:23:34.650230 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.650365 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.650414 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.650454 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:50.650434805 +0000 UTC m=+966.020345062 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "metrics-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.650494 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs podName:5852e12a-376e-420f-a0fd-efecae7ef623 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:50.650477266 +0000 UTC m=+966.020387523 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-qpp9f" (UID: "5852e12a-376e-420f-a0fd-efecae7ef623") : secret "webhook-server-cert" not found
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.748142 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.748282 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xchc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-5sgtg_openstack-operators(a3f6ed4d-518f-4415-9378-73fca072d431): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.749384 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" podUID="a3f6ed4d-518f-4415-9378-73fca072d431"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.795525 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" podUID="a3f6ed4d-518f-4415-9378-73fca072d431"
Jan 30 05:23:34 crc kubenswrapper[4931]: E0130 05:23:34.796239 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" podUID="dea1ae69-0c15-4228-a323-dc6f762e3c82"
Jan 30 05:23:36 crc kubenswrapper[4931]: I0130 05:23:36.805769 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f"}
Jan 30 05:23:37 crc kubenswrapper[4931]: I0130 05:23:37.830063 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2" event={"ID":"ce7feb31-22f3-42d9-83b1-cd9155abae99","Type":"ContainerStarted","Data":"d3cab9b2e4ee8963cdfd7db313c28e5e14abdfed3fdc74d199ad0622b444578d"}
Jan 30 05:23:37 crc kubenswrapper[4931]: I0130 05:23:37.830566 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2"
Jan 30 05:23:37 crc kubenswrapper[4931]: I0130 05:23:37.835995 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z" event={"ID":"80b25db7-e1c2-4787-89f4-952cd7e845ba","Type":"ContainerStarted","Data":"c44314702bfc41dc7a507bed53fae47af3f510a1255238b7d83deebd6b131685"}
Jan 30 05:23:37 crc kubenswrapper[4931]: I0130 05:23:37.836032 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z"
Jan 30 05:23:37 crc kubenswrapper[4931]: I0130 05:23:37.850321 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2" podStartSLOduration=4.217877207 podStartE2EDuration="19.850302556s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.183782714 +0000 UTC m=+934.553692971" lastFinishedPulling="2026-01-30 05:23:34.816208043 +0000 UTC m=+950.186118320" observedRunningTime="2026-01-30 05:23:37.843947682 +0000 UTC m=+953.213857969" watchObservedRunningTime="2026-01-30 05:23:37.850302556 +0000 UTC m=+953.220212813"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.843239 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk" event={"ID":"eb76dd84-30db-4769-852c-9a42814949d7","Type":"ContainerStarted","Data":"ac1b2a0a43d20788ee404a395fdde4cb810deb70c9ea522720b066fa608a6a98"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.844355 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.853732 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" event={"ID":"2b83a9b3-5579-438f-8f65-effa382b726c","Type":"ContainerStarted","Data":"fc065fa52f3f82077a3221e1fa1735b74f16d593029388265c4a1a7c9c574370"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.854250 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.860983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw" event={"ID":"cc5025a4-0807-478d-831a-c6ed424628a9","Type":"ContainerStarted","Data":"7d8ecc6ac1fdf3e3641188b711a0755410e43838332f7877df696903d674e460"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.861526 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.863991 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" event={"ID":"3d63764e-5f26-4a63-870a-af0e86eb5d23","Type":"ContainerStarted","Data":"da1b433065e2e87a24641bff526ae9b9c0f32f0f87e7653bcf885cb86419ad0f"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.864318 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.867780 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" event={"ID":"9e5eb1e9-111a-4230-92d6-5b1fbc332ada","Type":"ContainerStarted","Data":"480d7fd833d0af0b1986b5fd062806d3f39ea5d40c5b0efbe29fce5e12054cdf"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.868128 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.869100 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2" event={"ID":"d806e5bf-8346-46c0-a3de-5f8412e92b4f","Type":"ContainerStarted","Data":"32f9648fa8035ad706f5ef17b8b74cce10cfdd6e508bf4cdc19932320e5474d7"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.869438 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.870264 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26" event={"ID":"2773429e-ccbb-43a4-a88a-a1cd41a63e10","Type":"ContainerStarted","Data":"5d48d2c9aa3ee2fd32b371415f8f3fcad14bd90c4ff9cbd0fdb0019886b9a8bf"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.870615 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.874152 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" event={"ID":"456074da-531d-471b-92d3-cb4ea156bfae","Type":"ContainerStarted","Data":"79be38d24d378e06b783fdd24e85fb505677806404cd6bc57aa6140912680645"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.874497 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.875372 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" event={"ID":"a536697c-8056-4907-a09e-b23aa129435d","Type":"ContainerStarted","Data":"aed51bac6c8458a9f0f204ccad30e73cc8979986750b94b3495be05b963d785c"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.875703 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.876520 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" event={"ID":"59634caa-7fe0-49a1-98bf-dbc61a15f495","Type":"ContainerStarted","Data":"fe49d6efca633ced99c67e5797d6f7ba6a765a838b713eb01e9c10374fda81e0"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.876843 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.879475 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" event={"ID":"8553945b-dfe3-4c77-bb73-dce58c6ad3ba","Type":"ContainerStarted","Data":"92b2144b7f36ac97060b832e1eebaa80170871d3e428616f6020807babefccd0"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.879695 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.883510 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" event={"ID":"8e470db6-3785-4da2-9b83-5242d6712d6a","Type":"ContainerStarted","Data":"c9416a8be147da5a30fb922010687ce898274467c6bbfe71fc12e6ddbcbe95a9"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.883852 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m"
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.892752 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" event={"ID":"6d92f2e0-367c-428a-bcd5-cf6e5846046f","Type":"ContainerStarted","Data":"5fff86b6b08566970d03a7221c38f5ec2ae15bd27ff24102681d4c55d71bad36"}
Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.893610 4931
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.912491 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" event={"ID":"5e6de10d-baf2-4ef4-9acf-d093ee65c4fd","Type":"ContainerStarted","Data":"5f406d7a163233efebbdaaa4b40595ee6c44d87c16232dff8db32617a4297178"} Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.912528 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" Jan 30 05:23:38 crc kubenswrapper[4931]: I0130 05:23:38.928436 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z" podStartSLOduration=5.132896858 podStartE2EDuration="20.928405562s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:18.951015443 +0000 UTC m=+934.320925700" lastFinishedPulling="2026-01-30 05:23:34.746524147 +0000 UTC m=+950.116434404" observedRunningTime="2026-01-30 05:23:37.863929451 +0000 UTC m=+953.233839708" watchObservedRunningTime="2026-01-30 05:23:38.928405562 +0000 UTC m=+954.298315819" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.023695 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk" podStartSLOduration=4.801171607 podStartE2EDuration="21.023675282s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.049022888 +0000 UTC m=+934.418933145" lastFinishedPulling="2026-01-30 05:23:35.271526563 +0000 UTC m=+950.641436820" observedRunningTime="2026-01-30 05:23:38.931691432 +0000 UTC m=+954.301601689" watchObservedRunningTime="2026-01-30 05:23:39.023675282 +0000 UTC 
m=+954.393585529" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.026841 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" podStartSLOduration=4.793139496 podStartE2EDuration="21.026832179s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.793709476 +0000 UTC m=+935.163619733" lastFinishedPulling="2026-01-30 05:23:36.027402159 +0000 UTC m=+951.397312416" observedRunningTime="2026-01-30 05:23:39.013533513 +0000 UTC m=+954.383443770" watchObservedRunningTime="2026-01-30 05:23:39.026832179 +0000 UTC m=+954.396742436" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.081281 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw" podStartSLOduration=5.169252098 podStartE2EDuration="21.081260935s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.359400793 +0000 UTC m=+934.729311050" lastFinishedPulling="2026-01-30 05:23:35.27140962 +0000 UTC m=+950.641319887" observedRunningTime="2026-01-30 05:23:39.071896568 +0000 UTC m=+954.441806825" watchObservedRunningTime="2026-01-30 05:23:39.081260935 +0000 UTC m=+954.451171192" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.103689 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26" podStartSLOduration=5.5305280020000005 podStartE2EDuration="21.103671642s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.173408748 +0000 UTC m=+934.543319005" lastFinishedPulling="2026-01-30 05:23:34.746552388 +0000 UTC m=+950.116462645" observedRunningTime="2026-01-30 05:23:39.099757794 +0000 UTC m=+954.469668051" watchObservedRunningTime="2026-01-30 05:23:39.103671642 +0000 UTC m=+954.473581899" Jan 
30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.187581 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" podStartSLOduration=6.165518154 podStartE2EDuration="21.187567039s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.793558732 +0000 UTC m=+935.163468989" lastFinishedPulling="2026-01-30 05:23:34.815607617 +0000 UTC m=+950.185517874" observedRunningTime="2026-01-30 05:23:39.181378719 +0000 UTC m=+954.551288976" watchObservedRunningTime="2026-01-30 05:23:39.187567039 +0000 UTC m=+954.557477296" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.234380 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" podStartSLOduration=5.773483733 podStartE2EDuration="21.234360205s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.810534118 +0000 UTC m=+935.180444375" lastFinishedPulling="2026-01-30 05:23:35.27141059 +0000 UTC m=+950.641320847" observedRunningTime="2026-01-30 05:23:39.231882387 +0000 UTC m=+954.601792644" watchObservedRunningTime="2026-01-30 05:23:39.234360205 +0000 UTC m=+954.604270462" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.276751 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" podStartSLOduration=6.260233388 podStartE2EDuration="21.276734331s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.794062685 +0000 UTC m=+935.163972942" lastFinishedPulling="2026-01-30 05:23:34.810563628 +0000 UTC m=+950.180473885" observedRunningTime="2026-01-30 05:23:39.276042442 +0000 UTC m=+954.645952689" watchObservedRunningTime="2026-01-30 05:23:39.276734331 +0000 UTC m=+954.646644588" Jan 30 05:23:39 crc 
kubenswrapper[4931]: I0130 05:23:39.335041 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" podStartSLOduration=3.770093214 podStartE2EDuration="21.335020174s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.84185956 +0000 UTC m=+935.211769817" lastFinishedPulling="2026-01-30 05:23:37.40678653 +0000 UTC m=+952.776696777" observedRunningTime="2026-01-30 05:23:39.333878662 +0000 UTC m=+954.703788919" watchObservedRunningTime="2026-01-30 05:23:39.335020174 +0000 UTC m=+954.704930431" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.338844 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" podStartSLOduration=3.7415599779999997 podStartE2EDuration="21.338832918s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.810763185 +0000 UTC m=+935.180673442" lastFinishedPulling="2026-01-30 05:23:37.408036125 +0000 UTC m=+952.777946382" observedRunningTime="2026-01-30 05:23:39.304757451 +0000 UTC m=+954.674667738" watchObservedRunningTime="2026-01-30 05:23:39.338832918 +0000 UTC m=+954.708743175" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.389592 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" podStartSLOduration=3.828140469 podStartE2EDuration="21.389574244s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.844711308 +0000 UTC m=+935.214621565" lastFinishedPulling="2026-01-30 05:23:37.406145083 +0000 UTC m=+952.776055340" observedRunningTime="2026-01-30 05:23:39.384825253 +0000 UTC m=+954.754735510" watchObservedRunningTime="2026-01-30 05:23:39.389574244 +0000 UTC m=+954.759484501" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 
05:23:39.439311 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" podStartSLOduration=6.42317841 podStartE2EDuration="21.439289661s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.793813899 +0000 UTC m=+935.163724156" lastFinishedPulling="2026-01-30 05:23:34.80992515 +0000 UTC m=+950.179835407" observedRunningTime="2026-01-30 05:23:39.431031774 +0000 UTC m=+954.800942031" watchObservedRunningTime="2026-01-30 05:23:39.439289661 +0000 UTC m=+954.809199918" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.461884 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" podStartSLOduration=2.917636312 podStartE2EDuration="21.461866732s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.840550264 +0000 UTC m=+935.210460521" lastFinishedPulling="2026-01-30 05:23:38.384780684 +0000 UTC m=+953.754690941" observedRunningTime="2026-01-30 05:23:39.45999568 +0000 UTC m=+954.829905937" watchObservedRunningTime="2026-01-30 05:23:39.461866732 +0000 UTC m=+954.831776989" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.496294 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2" podStartSLOduration=5.603526871 podStartE2EDuration="21.496276828s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.378613241 +0000 UTC m=+934.748523498" lastFinishedPulling="2026-01-30 05:23:35.271363198 +0000 UTC m=+950.641273455" observedRunningTime="2026-01-30 05:23:39.491098206 +0000 UTC m=+954.861008463" watchObservedRunningTime="2026-01-30 05:23:39.496276828 +0000 UTC m=+954.866187075" Jan 30 05:23:39 crc kubenswrapper[4931]: I0130 05:23:39.529896 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" podStartSLOduration=2.998925867 podStartE2EDuration="21.529881272s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.811196997 +0000 UTC m=+935.181107254" lastFinishedPulling="2026-01-30 05:23:38.342152402 +0000 UTC m=+953.712062659" observedRunningTime="2026-01-30 05:23:39.528116523 +0000 UTC m=+954.898026780" watchObservedRunningTime="2026-01-30 05:23:39.529881272 +0000 UTC m=+954.899791529" Jan 30 05:23:42 crc kubenswrapper[4931]: I0130 05:23:42.946543 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" event={"ID":"ad890bc5-5b72-4833-86d5-2c022cd87e4a","Type":"ContainerStarted","Data":"cac9d9cf02537d9f30af2fd8be71b7f5f9ec7eb7b38a9f7bfef1be1aab977885"} Jan 30 05:23:42 crc kubenswrapper[4931]: I0130 05:23:42.971202 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v4vnz" podStartSLOduration=2.997967851 podStartE2EDuration="24.971176723s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.833071488 +0000 UTC m=+935.202981745" lastFinishedPulling="2026-01-30 05:23:41.80628036 +0000 UTC m=+957.176190617" observedRunningTime="2026-01-30 05:23:42.966579456 +0000 UTC m=+958.336489773" watchObservedRunningTime="2026-01-30 05:23:42.971176723 +0000 UTC m=+958.341087010" Jan 30 05:23:47 crc kubenswrapper[4931]: I0130 05:23:47.998772 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" event={"ID":"dea1ae69-0c15-4228-a323-dc6f762e3c82","Type":"ContainerStarted","Data":"74e87ffff406463ee0865f7b17c2812c2fd93121443f274041a32e947224d69f"} Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 
05:23:47.999796 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.028697 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z" podStartSLOduration=2.158455816 podStartE2EDuration="30.028673317s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.077561763 +0000 UTC m=+934.447472020" lastFinishedPulling="2026-01-30 05:23:46.947779224 +0000 UTC m=+962.317689521" observedRunningTime="2026-01-30 05:23:48.024307247 +0000 UTC m=+963.394217554" watchObservedRunningTime="2026-01-30 05:23:48.028673317 +0000 UTC m=+963.398583584" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.358302 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-mttxk" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.384722 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4wv6z" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.405115 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-nsn26" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.432164 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lmgq2" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.457304 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-l5dv2" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.544669 4931 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-ddtbw" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.601244 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-fsdvn" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.665878 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wssqz" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.735003 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-5l9jv" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.747241 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kndp7" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.785005 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-mkk7j" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.802353 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-t4scx" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.818640 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gqvgs" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.835315 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-gqv2m" Jan 30 05:23:48 crc kubenswrapper[4931]: I0130 05:23:48.896242 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vccxr" Jan 30 05:23:49 crc kubenswrapper[4931]: I0130 05:23:49.003218 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-vqp2s" Jan 30 05:23:49 crc kubenswrapper[4931]: I0130 05:23:49.007501 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" event={"ID":"a3f6ed4d-518f-4415-9378-73fca072d431","Type":"ContainerStarted","Data":"c5b2cb4f2ceb89c9e32a2fc6e8ad2b3582da479d12e85146f2f6f17a818cad3d"} Jan 30 05:23:49 crc kubenswrapper[4931]: I0130 05:23:49.008076 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" Jan 30 05:23:49 crc kubenswrapper[4931]: I0130 05:23:49.013818 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" event={"ID":"33b18ace-2da3-4bad-b093-d7db2aad7f50","Type":"ContainerStarted","Data":"23e951c62f4ac3258f76d8365dba7c4e0d144bbf7537acc8fff1bbfe4d7f5ec0"} Jan 30 05:23:49 crc kubenswrapper[4931]: I0130 05:23:49.014063 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" Jan 30 05:23:49 crc kubenswrapper[4931]: I0130 05:23:49.045607 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg" podStartSLOduration=1.987107304 podStartE2EDuration="31.045587291s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.80876808 +0000 UTC m=+935.178678337" lastFinishedPulling="2026-01-30 05:23:48.867248067 +0000 UTC m=+964.237158324" observedRunningTime="2026-01-30 05:23:49.0350049 +0000 UTC m=+964.404915177" 
watchObservedRunningTime="2026-01-30 05:23:49.045587291 +0000 UTC m=+964.415497548" Jan 30 05:23:49 crc kubenswrapper[4931]: I0130 05:23:49.059987 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj" podStartSLOduration=2.4049755250000002 podStartE2EDuration="31.059973277s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:19.369310975 +0000 UTC m=+934.739221242" lastFinishedPulling="2026-01-30 05:23:48.024308727 +0000 UTC m=+963.394218994" observedRunningTime="2026-01-30 05:23:49.059589996 +0000 UTC m=+964.429500253" watchObservedRunningTime="2026-01-30 05:23:49.059973277 +0000 UTC m=+964.429883534" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.121220 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.130826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29ae7a52-ff32-4f97-8f6c-830ac4e4b40b-cert\") pod \"infra-operator-controller-manager-79955696d6-tzxqv\" (UID: \"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.296238 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5fpjl" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.305168 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.325549 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.330604 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47b128c8-46ef-422c-aabc-1220f85fef83-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp\" (UID: \"47b128c8-46ef-422c-aabc-1220f85fef83\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.552010 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"] Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.571059 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zl4cx" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.580348 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.736578 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.736718 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.746380 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.747013 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5852e12a-376e-420f-a0fd-efecae7ef623-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-qpp9f\" (UID: \"5852e12a-376e-420f-a0fd-efecae7ef623\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.839709 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"] Jan 30 05:23:50 crc kubenswrapper[4931]: W0130 05:23:50.844933 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b128c8_46ef_422c_aabc_1220f85fef83.slice/crio-ecab003ed86f5685bfcb4ee8bfa058812cc32ad956a26e8a03faa87a6e7f1234 WatchSource:0}: Error finding container ecab003ed86f5685bfcb4ee8bfa058812cc32ad956a26e8a03faa87a6e7f1234: Status 404 returned error can't find the container with id ecab003ed86f5685bfcb4ee8bfa058812cc32ad956a26e8a03faa87a6e7f1234 Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.895015 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sxl8b" Jan 30 05:23:50 crc kubenswrapper[4931]: I0130 05:23:50.903690 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:51 crc kubenswrapper[4931]: I0130 05:23:51.040365 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" event={"ID":"47b128c8-46ef-422c-aabc-1220f85fef83","Type":"ContainerStarted","Data":"ecab003ed86f5685bfcb4ee8bfa058812cc32ad956a26e8a03faa87a6e7f1234"} Jan 30 05:23:51 crc kubenswrapper[4931]: I0130 05:23:51.042146 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" event={"ID":"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b","Type":"ContainerStarted","Data":"6585170c17fcbafd900d4df4f324a65b80219870b607472bc1bb47bb88b4be49"} Jan 30 05:23:51 crc kubenswrapper[4931]: I0130 05:23:51.196121 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"] Jan 30 05:23:52 crc 
kubenswrapper[4931]: I0130 05:23:52.056170 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" event={"ID":"5852e12a-376e-420f-a0fd-efecae7ef623","Type":"ContainerStarted","Data":"445fe4c7045195b6564422c7068094c3df8b23bc420290d25f20db125cadefdb"} Jan 30 05:23:52 crc kubenswrapper[4931]: I0130 05:23:52.056216 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" event={"ID":"5852e12a-376e-420f-a0fd-efecae7ef623","Type":"ContainerStarted","Data":"806cf08e848406395c1aced7aa0c86ae87439f12aaa856a998612f458f4062a2"} Jan 30 05:23:52 crc kubenswrapper[4931]: I0130 05:23:52.057222 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" Jan 30 05:23:52 crc kubenswrapper[4931]: I0130 05:23:52.086731 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f" podStartSLOduration=34.086717017 podStartE2EDuration="34.086717017s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:23:52.08536719 +0000 UTC m=+967.455277447" watchObservedRunningTime="2026-01-30 05:23:52.086717017 +0000 UTC m=+967.456627274" Jan 30 05:23:54 crc kubenswrapper[4931]: I0130 05:23:54.070892 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" event={"ID":"29ae7a52-ff32-4f97-8f6c-830ac4e4b40b","Type":"ContainerStarted","Data":"84679140b3518d88037d90080960b88b7b8345bd0388d026da15d2a2c3c2dd76"} Jan 30 05:23:54 crc kubenswrapper[4931]: I0130 05:23:54.071314 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:23:54 crc kubenswrapper[4931]: I0130 05:23:54.074663 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" event={"ID":"47b128c8-46ef-422c-aabc-1220f85fef83","Type":"ContainerStarted","Data":"9c0d20f49a082af89977722012c4671034b35f92aea87540386161f822f2c2b6"}
Jan 30 05:23:54 crc kubenswrapper[4931]: I0130 05:23:54.101151 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv" podStartSLOduration=33.293509585 podStartE2EDuration="36.101131541s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:50.55439225 +0000 UTC m=+965.924302507" lastFinishedPulling="2026-01-30 05:23:53.362014196 +0000 UTC m=+968.731924463" observedRunningTime="2026-01-30 05:23:54.092500103 +0000 UTC m=+969.462410360" watchObservedRunningTime="2026-01-30 05:23:54.101131541 +0000 UTC m=+969.471041798"
Jan 30 05:23:54 crc kubenswrapper[4931]: I0130 05:23:54.131754 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp" podStartSLOduration=33.601185165 podStartE2EDuration="36.131728532s" podCreationTimestamp="2026-01-30 05:23:18 +0000 UTC" firstStartedPulling="2026-01-30 05:23:50.846922114 +0000 UTC m=+966.216832371" lastFinishedPulling="2026-01-30 05:23:53.377465481 +0000 UTC m=+968.747375738" observedRunningTime="2026-01-30 05:23:54.130174409 +0000 UTC m=+969.500084676" watchObservedRunningTime="2026-01-30 05:23:54.131728532 +0000 UTC m=+969.501638799"
Jan 30 05:23:55 crc kubenswrapper[4931]: I0130 05:23:55.081974 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"
Jan 30 05:23:58 crc kubenswrapper[4931]: I0130 05:23:58.382746 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-bf56z"
Jan 30 05:23:58 crc kubenswrapper[4931]: I0130 05:23:58.529404 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-v9fgj"
Jan 30 05:23:58 crc kubenswrapper[4931]: I0130 05:23:58.610663 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5sgtg"
Jan 30 05:24:00 crc kubenswrapper[4931]: I0130 05:24:00.316563 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-tzxqv"
Jan 30 05:24:00 crc kubenswrapper[4931]: I0130 05:24:00.590560 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp"
Jan 30 05:24:00 crc kubenswrapper[4931]: I0130 05:24:00.913668 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-qpp9f"
Jan 30 05:24:15 crc kubenswrapper[4931]: I0130 05:24:15.986839 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-lfczj"]
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.006444 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.009599 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-lfczj"]
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.010903 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.011757 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.012020 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.012324 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rmhq8"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.031300 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-249jr"]
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.032336 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.036953 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.038116 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f097854-006c-4110-bec8-b9d364ddb000-config\") pod \"dnsmasq-dns-84bb9d8bd9-lfczj\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.038172 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q99sh\" (UniqueName: \"kubernetes.io/projected/7f097854-006c-4110-bec8-b9d364ddb000-kube-api-access-q99sh\") pod \"dnsmasq-dns-84bb9d8bd9-lfczj\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.051936 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-249jr"]
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.139839 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q99sh\" (UniqueName: \"kubernetes.io/projected/7f097854-006c-4110-bec8-b9d364ddb000-kube-api-access-q99sh\") pod \"dnsmasq-dns-84bb9d8bd9-lfczj\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.139907 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-config\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.139932 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx7lm\" (UniqueName: \"kubernetes.io/projected/942de512-1fdc-4955-a703-ccd872474993-kube-api-access-kx7lm\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.140032 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-dns-svc\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.140060 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f097854-006c-4110-bec8-b9d364ddb000-config\") pod \"dnsmasq-dns-84bb9d8bd9-lfczj\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.141214 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f097854-006c-4110-bec8-b9d364ddb000-config\") pod \"dnsmasq-dns-84bb9d8bd9-lfczj\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.158747 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q99sh\" (UniqueName: \"kubernetes.io/projected/7f097854-006c-4110-bec8-b9d364ddb000-kube-api-access-q99sh\") pod \"dnsmasq-dns-84bb9d8bd9-lfczj\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.241596 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-config\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.241652 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx7lm\" (UniqueName: \"kubernetes.io/projected/942de512-1fdc-4955-a703-ccd872474993-kube-api-access-kx7lm\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.241753 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-dns-svc\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.242873 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-dns-svc\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.242955 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-config\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.269155 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx7lm\" (UniqueName: \"kubernetes.io/projected/942de512-1fdc-4955-a703-ccd872474993-kube-api-access-kx7lm\") pod \"dnsmasq-dns-5f854695bc-249jr\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.338109 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.372976 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-249jr"
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.624331 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-lfczj"]
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.639929 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 05:24:16 crc kubenswrapper[4931]: I0130 05:24:16.695285 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-249jr"]
Jan 30 05:24:16 crc kubenswrapper[4931]: W0130 05:24:16.696100 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod942de512_1fdc_4955_a703_ccd872474993.slice/crio-e64df3f8ede918e6ed1e054ec67159e4720f4b2da72fde95107c95f43f1ee3d0 WatchSource:0}: Error finding container e64df3f8ede918e6ed1e054ec67159e4720f4b2da72fde95107c95f43f1ee3d0: Status 404 returned error can't find the container with id e64df3f8ede918e6ed1e054ec67159e4720f4b2da72fde95107c95f43f1ee3d0
Jan 30 05:24:17 crc kubenswrapper[4931]: I0130 05:24:17.270651 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj" event={"ID":"7f097854-006c-4110-bec8-b9d364ddb000","Type":"ContainerStarted","Data":"bbc8e313c79f2552b4495d71bd60cf2000cede24398da8824f1260de69102011"}
Jan 30 05:24:17 crc kubenswrapper[4931]: I0130 05:24:17.272168 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-249jr" event={"ID":"942de512-1fdc-4955-a703-ccd872474993","Type":"ContainerStarted","Data":"e64df3f8ede918e6ed1e054ec67159e4720f4b2da72fde95107c95f43f1ee3d0"}
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.564113 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-lfczj"]
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.585118 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-tbgfx"]
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.586858 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.601915 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-tbgfx"]
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.684906 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-config\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.684972 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvn6\" (UniqueName: \"kubernetes.io/projected/75ff6901-d6c9-467a-a4d2-35ddb8050570-kube-api-access-wkvn6\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.685024 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.786225 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-config\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.786273 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvn6\" (UniqueName: \"kubernetes.io/projected/75ff6901-d6c9-467a-a4d2-35ddb8050570-kube-api-access-wkvn6\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.786308 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.787203 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.787592 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-config\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.815496 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvn6\" (UniqueName: \"kubernetes.io/projected/75ff6901-d6c9-467a-a4d2-35ddb8050570-kube-api-access-wkvn6\") pod \"dnsmasq-dns-744ffd65bc-tbgfx\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.872398 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-249jr"]
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.899917 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9vtmt"]
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.901514 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.909876 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx"
Jan 30 05:24:18 crc kubenswrapper[4931]: I0130 05:24:18.911365 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9vtmt"]
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.090331 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-dns-svc\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.090384 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-config\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.090460 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knxhw\" (UniqueName: \"kubernetes.io/projected/7358be48-7c82-45bd-8165-0c02dcdb3666-kube-api-access-knxhw\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.195218 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knxhw\" (UniqueName: \"kubernetes.io/projected/7358be48-7c82-45bd-8165-0c02dcdb3666-kube-api-access-knxhw\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.195302 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-dns-svc\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.195331 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-config\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.196136 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-config\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.198237 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-dns-svc\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.218198 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knxhw\" (UniqueName: \"kubernetes.io/projected/7358be48-7c82-45bd-8165-0c02dcdb3666-kube-api-access-knxhw\") pod \"dnsmasq-dns-95f5f6995-9vtmt\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.225284 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.444761 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-tbgfx"]
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.648960 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9vtmt"]
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.702410 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.703487 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.706438 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.706506 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.706643 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.706704 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.706804 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.707004 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.707109 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bn5cs"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.719533 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904255 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904298 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904339 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc3f4796-66b1-452b-afca-5e62cbf2a53b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904365 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xv5\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-kube-api-access-d6xv5\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904381 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904406 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904554 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904773 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904832 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc3f4796-66b1-452b-afca-5e62cbf2a53b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:19 crc kubenswrapper[4931]: I0130 05:24:19.904905 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.006457 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.006958 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.006983 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.007152 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.007300 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.008879 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc3f4796-66b1-452b-afca-5e62cbf2a53b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.008945 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.008966 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xv5\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-kube-api-access-d6xv5\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.009033 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.009111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.009130 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.009142 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.009339 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.009491 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc3f4796-66b1-452b-afca-5e62cbf2a53b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.010077 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.010496 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.017280 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.017922 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc3f4796-66b1-452b-afca-5e62cbf2a53b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.020949 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.021630 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.023736 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.024335 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.024485 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.025204 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.025502 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-smjgk"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.025595 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.025704 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.025843 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.025904 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.030133 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.030560 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.030877 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc3f4796-66b1-452b-afca-5e62cbf2a53b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.049521 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xv5\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-kube-api-access-d6xv5\") pod \"rabbitmq-server-0\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " pod="openstack/rabbitmq-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.214965 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.215016 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.215075 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.216343 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/081e3873-ea99-4486-925f-784a98e49405-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.217444 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.217537 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.217595 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.217637 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrm76\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-kube-api-access-rrm76\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.217682 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/081e3873-ea99-4486-925f-784a98e49405-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.217707 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.217737 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.321850 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrm76\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-kube-api-access-rrm76\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.321908 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/081e3873-ea99-4486-925f-784a98e49405-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.321927 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.321950 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.321973 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.321990 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.322030 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.322058 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/081e3873-ea99-4486-925f-784a98e49405-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.322072 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.322092 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.322259 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.322613 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.322911 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.323040 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.323235 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.323907 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.324286 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.324304 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.329856 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/081e3873-ea99-4486-925f-784a98e49405-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.331181 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.342212 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrm76\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-kube-api-access-rrm76\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.343398 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/081e3873-ea99-4486-925f-784a98e49405-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.345125 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.352390 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:20 crc kubenswrapper[4931]: I0130 05:24:20.429535 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.360777 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.365876 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.375010 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.375961 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.376408 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.376577 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ms6mr" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.380907 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.382051 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.551341 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-default\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.551394 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fq4\" (UniqueName: \"kubernetes.io/projected/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kube-api-access-t5fq4\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.551436 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-operator-scripts\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.551626 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.551724 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-generated\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.551870 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kolla-config\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.551927 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.552085 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.653654 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-default\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.653705 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fq4\" (UniqueName: \"kubernetes.io/projected/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kube-api-access-t5fq4\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.653732 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-operator-scripts\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.653763 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.653969 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-generated\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.654007 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kolla-config\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.654031 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.654118 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.655096 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-default\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.655129 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-generated\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " 
pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.655239 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kolla-config\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.655432 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.656193 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-operator-scripts\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.659205 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.664117 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.673276 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-t5fq4\" (UniqueName: \"kubernetes.io/projected/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kube-api-access-t5fq4\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.680072 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " pod="openstack/openstack-galera-0" Jan 30 05:24:21 crc kubenswrapper[4931]: I0130 05:24:21.713909 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.809284 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.815618 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.822793 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.823139 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.823381 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-wvx2b" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.824389 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.852765 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974417 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974487 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974522 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974549 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbw9z\" (UniqueName: \"kubernetes.io/projected/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kube-api-access-fbw9z\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974581 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974676 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974705 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:22 crc kubenswrapper[4931]: I0130 05:24:22.974727 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078544 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078602 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078624 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078666 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078688 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078708 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078729 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbw9z\" (UniqueName: \"kubernetes.io/projected/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kube-api-access-fbw9z\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.078752 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.079471 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.079698 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.080060 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.080805 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.080921 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.084764 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.085974 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.095727 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.097036 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.098823 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.098875 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4v8gl"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.099856 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.106728 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbw9z\" (UniqueName: \"kubernetes.io/projected/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kube-api-access-fbw9z\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.125923 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.145408 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.156738 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.179946 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kolla-config\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.179999 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-config-data\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.180149 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l65cq\" (UniqueName: \"kubernetes.io/projected/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kube-api-access-l65cq\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.180246 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.180286 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.282148 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l65cq\" (UniqueName: \"kubernetes.io/projected/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kube-api-access-l65cq\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.282204 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.282231 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.282282 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kolla-config\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.282311 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-config-data\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.283200 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kolla-config\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.283208 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-config-data\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.286886 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.288826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.297207 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l65cq\" (UniqueName: \"kubernetes.io/projected/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kube-api-access-l65cq\") pod \"memcached-0\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: I0130 05:24:23.478159 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 30 05:24:23 crc kubenswrapper[4931]: W0130 05:24:23.612802 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ff6901_d6c9_467a_a4d2_35ddb8050570.slice/crio-d42db6c19bc5af9b7f0bda95ed922048afc3e58c472145aa59ba1e17a5184206 WatchSource:0}: Error finding container d42db6c19bc5af9b7f0bda95ed922048afc3e58c472145aa59ba1e17a5184206: Status 404 returned error can't find the container with id d42db6c19bc5af9b7f0bda95ed922048afc3e58c472145aa59ba1e17a5184206
Jan 30 05:24:23 crc kubenswrapper[4931]: W0130 05:24:23.613307 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7358be48_7c82_45bd_8165_0c02dcdb3666.slice/crio-16572f431ff9e8b9359ca1361817097bd7aaff980c134f42bc30a47f2e623bfc WatchSource:0}: Error finding container 16572f431ff9e8b9359ca1361817097bd7aaff980c134f42bc30a47f2e623bfc: Status 404 returned error can't find the container with id 16572f431ff9e8b9359ca1361817097bd7aaff980c134f42bc30a47f2e623bfc
Jan 30 05:24:24 crc kubenswrapper[4931]: I0130 05:24:24.328738 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx" event={"ID":"75ff6901-d6c9-467a-a4d2-35ddb8050570","Type":"ContainerStarted","Data":"d42db6c19bc5af9b7f0bda95ed922048afc3e58c472145aa59ba1e17a5184206"}
Jan 30 05:24:24 crc kubenswrapper[4931]: I0130 05:24:24.329911 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" event={"ID":"7358be48-7c82-45bd-8165-0c02dcdb3666","Type":"ContainerStarted","Data":"16572f431ff9e8b9359ca1361817097bd7aaff980c134f42bc30a47f2e623bfc"}
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.011008 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.012493 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.015498 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-b6t4t"
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.017510 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.112960 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqss9\" (UniqueName: \"kubernetes.io/projected/75e7b62f-8246-48b8-bcbb-d7c5129dd5e2-kube-api-access-tqss9\") pod \"kube-state-metrics-0\" (UID: \"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2\") " pod="openstack/kube-state-metrics-0"
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.215306 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqss9\" (UniqueName: \"kubernetes.io/projected/75e7b62f-8246-48b8-bcbb-d7c5129dd5e2-kube-api-access-tqss9\") pod \"kube-state-metrics-0\" (UID: \"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2\") " pod="openstack/kube-state-metrics-0"
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.238261 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqss9\" (UniqueName: \"kubernetes.io/projected/75e7b62f-8246-48b8-bcbb-d7c5129dd5e2-kube-api-access-tqss9\") pod \"kube-state-metrics-0\" (UID: \"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2\") " pod="openstack/kube-state-metrics-0"
Jan 30 05:24:25 crc kubenswrapper[4931]: I0130 05:24:25.370387 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.975163 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ggjtl"]
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.976788 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.979694 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zlq6r"
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.980219 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.987656 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-thxc2"]
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.987850 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.990067 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:28 crc kubenswrapper[4931]: I0130 05:24:28.993876 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ggjtl"]
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077171 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-ovn-controller-tls-certs\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077219 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-log-ovn\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077243 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077261 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-log\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077280 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-lib\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run-ovn\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077311 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7snh\" (UniqueName: \"kubernetes.io/projected/8a337463-8b7e-496b-9a01-fc491120c21d-kube-api-access-l7snh\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077329 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-etc-ovs\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077373 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d259g\" (UniqueName: \"kubernetes.io/projected/5732e34e-6330-4a36-9082-dbb50eede9f2-kube-api-access-d259g\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077396 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-run\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077453 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5732e34e-6330-4a36-9082-dbb50eede9f2-scripts\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077472 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a337463-8b7e-496b-9a01-fc491120c21d-scripts\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.077500 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-combined-ca-bundle\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.080382 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-thxc2"]
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178719 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d259g\" (UniqueName: \"kubernetes.io/projected/5732e34e-6330-4a36-9082-dbb50eede9f2-kube-api-access-d259g\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178764 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-run\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178794 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5732e34e-6330-4a36-9082-dbb50eede9f2-scripts\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178816 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a337463-8b7e-496b-9a01-fc491120c21d-scripts\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178841 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-combined-ca-bundle\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-ovn-controller-tls-certs\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178885 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-log-ovn\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178905 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178926 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-log\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178946 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-lib\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178972 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run-ovn\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.178988 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7snh\" (UniqueName: \"kubernetes.io/projected/8a337463-8b7e-496b-9a01-fc491120c21d-kube-api-access-l7snh\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.179009 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-etc-ovs\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.179557 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-etc-ovs\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.179972 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run-ovn\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.180096 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.180121 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-run\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.182059 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a337463-8b7e-496b-9a01-fc491120c21d-scripts\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.182166 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-log\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.182210 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-log-ovn\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.182366 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-lib\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.183690 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5732e34e-6330-4a36-9082-dbb50eede9f2-scripts\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.185186 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-ovn-controller-tls-certs\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.187514 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-combined-ca-bundle\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.192663 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d259g\" (UniqueName: \"kubernetes.io/projected/5732e34e-6330-4a36-9082-dbb50eede9f2-kube-api-access-d259g\") pod \"ovn-controller-ovs-thxc2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.197191 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7snh\" (UniqueName: \"kubernetes.io/projected/8a337463-8b7e-496b-9a01-fc491120c21d-kube-api-access-l7snh\") pod \"ovn-controller-ggjtl\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.335386 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ggjtl"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.339123 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.482299 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.484411 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.486361 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.486885 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.487186 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.487917 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.489179 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-s2vdq"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.514872 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586431 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2l8\" (UniqueName: \"kubernetes.io/projected/49a63fb4-24bc-4834-b6e7-937688c5de09-kube-api-access-6f2l8\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586478 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586508 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586534 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586621 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586638 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-config\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.586655 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName:
\"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.688643 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.688763 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.688796 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-config\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.688839 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.688877 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2l8\" (UniqueName: \"kubernetes.io/projected/49a63fb4-24bc-4834-b6e7-937688c5de09-kube-api-access-6f2l8\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " 
pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.688925 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.688981 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.689029 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.689996 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-config\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.690273 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.691058 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.692578 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.699548 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.701977 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.711086 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.714173 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2l8\" (UniqueName: \"kubernetes.io/projected/49a63fb4-24bc-4834-b6e7-937688c5de09-kube-api-access-6f2l8\") pod \"ovsdbserver-nb-0\" (UID: 
\"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.720165 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:29 crc kubenswrapper[4931]: I0130 05:24:29.808876 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:31 crc kubenswrapper[4931]: I0130 05:24:31.383103 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:24:32 crc kubenswrapper[4931]: E0130 05:24:32.108703 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 30 05:24:32 crc kubenswrapper[4931]: E0130 05:24:32.108873 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kx7lm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-249jr_openstack(942de512-1fdc-4955-a703-ccd872474993): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:24:32 crc kubenswrapper[4931]: E0130 05:24:32.110204 4931 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-249jr" podUID="942de512-1fdc-4955-a703-ccd872474993" Jan 30 05:24:32 crc kubenswrapper[4931]: W0130 05:24:32.120173 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod348ffd7a_9b7f_40aa_ada9_145a3a783d09.slice/crio-b84c7628d09612f5f198418a62d4a2daabe598f826560ac2746867af05368a8f WatchSource:0}: Error finding container b84c7628d09612f5f198418a62d4a2daabe598f826560ac2746867af05368a8f: Status 404 returned error can't find the container with id b84c7628d09612f5f198418a62d4a2daabe598f826560ac2746867af05368a8f Jan 30 05:24:32 crc kubenswrapper[4931]: E0130 05:24:32.142110 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 30 05:24:32 crc kubenswrapper[4931]: E0130 05:24:32.142496 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q99sh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-lfczj_openstack(7f097854-006c-4110-bec8-b9d364ddb000): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:24:32 crc kubenswrapper[4931]: E0130 05:24:32.143851 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj" podUID="7f097854-006c-4110-bec8-b9d364ddb000" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.389856 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"348ffd7a-9b7f-40aa-ada9-145a3a783d09","Type":"ContainerStarted","Data":"b84c7628d09612f5f198418a62d4a2daabe598f826560ac2746867af05368a8f"} Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.745678 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-249jr" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.751662 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.785773 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.786916 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.790396 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.790446 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.790749 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-c7nvq" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.790880 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.793861 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.845644 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-dns-svc\") pod \"942de512-1fdc-4955-a703-ccd872474993\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.845812 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx7lm\" (UniqueName: \"kubernetes.io/projected/942de512-1fdc-4955-a703-ccd872474993-kube-api-access-kx7lm\") pod \"942de512-1fdc-4955-a703-ccd872474993\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.845880 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-config\") pod \"942de512-1fdc-4955-a703-ccd872474993\" (UID: \"942de512-1fdc-4955-a703-ccd872474993\") " Jan 30 05:24:32 crc kubenswrapper[4931]: 
I0130 05:24:32.845933 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q99sh\" (UniqueName: \"kubernetes.io/projected/7f097854-006c-4110-bec8-b9d364ddb000-kube-api-access-q99sh\") pod \"7f097854-006c-4110-bec8-b9d364ddb000\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.845980 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f097854-006c-4110-bec8-b9d364ddb000-config\") pod \"7f097854-006c-4110-bec8-b9d364ddb000\" (UID: \"7f097854-006c-4110-bec8-b9d364ddb000\") " Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846513 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846572 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mdl\" (UniqueName: \"kubernetes.io/projected/f28f211b-be26-4f15-92a1-36b91cb53bbb-kube-api-access-k9mdl\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846620 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846687 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846744 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846821 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.846856 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-config\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.848160 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-dns-svc" (OuterVolumeSpecName: "dns-svc") pod 
"942de512-1fdc-4955-a703-ccd872474993" (UID: "942de512-1fdc-4955-a703-ccd872474993"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.850721 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-config" (OuterVolumeSpecName: "config") pod "942de512-1fdc-4955-a703-ccd872474993" (UID: "942de512-1fdc-4955-a703-ccd872474993"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.851289 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f097854-006c-4110-bec8-b9d364ddb000-config" (OuterVolumeSpecName: "config") pod "7f097854-006c-4110-bec8-b9d364ddb000" (UID: "7f097854-006c-4110-bec8-b9d364ddb000"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.854881 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f097854-006c-4110-bec8-b9d364ddb000-kube-api-access-q99sh" (OuterVolumeSpecName: "kube-api-access-q99sh") pod "7f097854-006c-4110-bec8-b9d364ddb000" (UID: "7f097854-006c-4110-bec8-b9d364ddb000"). InnerVolumeSpecName "kube-api-access-q99sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.855592 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942de512-1fdc-4955-a703-ccd872474993-kube-api-access-kx7lm" (OuterVolumeSpecName: "kube-api-access-kx7lm") pod "942de512-1fdc-4955-a703-ccd872474993" (UID: "942de512-1fdc-4955-a703-ccd872474993"). InnerVolumeSpecName "kube-api-access-kx7lm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.925787 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.932230 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.938173 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ggjtl"] Jan 30 05:24:32 crc kubenswrapper[4931]: W0130 05:24:32.941705 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e7b62f_8246_48b8_bcbb_d7c5129dd5e2.slice/crio-54dbd562a66dec0d4b1d17dc98e849f8bad3b54d165bc61c92a13695e75f4ae7 WatchSource:0}: Error finding container 54dbd562a66dec0d4b1d17dc98e849f8bad3b54d165bc61c92a13695e75f4ae7: Status 404 returned error can't find the container with id 54dbd562a66dec0d4b1d17dc98e849f8bad3b54d165bc61c92a13695e75f4ae7 Jan 30 05:24:32 crc kubenswrapper[4931]: W0130 05:24:32.941906 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a337463_8b7e_496b_9a01_fc491120c21d.slice/crio-c2331a0e3efc476073fa6f72907e46cdc0fd3358dd0c363648234586881ae09d WatchSource:0}: Error finding container c2331a0e3efc476073fa6f72907e46cdc0fd3358dd0c363648234586881ae09d: Status 404 returned error can't find the container with id c2331a0e3efc476073fa6f72907e46cdc0fd3358dd0c363648234586881ae09d Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948622 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: 
I0130 05:24:32.948661 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mdl\" (UniqueName: \"kubernetes.io/projected/f28f211b-be26-4f15-92a1-36b91cb53bbb-kube-api-access-k9mdl\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948688 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948728 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948767 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948791 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948808 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948827 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-config\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948887 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx7lm\" (UniqueName: \"kubernetes.io/projected/942de512-1fdc-4955-a703-ccd872474993-kube-api-access-kx7lm\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948898 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948908 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q99sh\" (UniqueName: \"kubernetes.io/projected/7f097854-006c-4110-bec8-b9d364ddb000-kube-api-access-q99sh\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948917 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f097854-006c-4110-bec8-b9d364ddb000-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.948924 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/942de512-1fdc-4955-a703-ccd872474993-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.949877 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-config\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.950139 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.950792 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.953670 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.954629 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.954645 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " 
pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.955140 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.972936 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mdl\" (UniqueName: \"kubernetes.io/projected/f28f211b-be26-4f15-92a1-36b91cb53bbb-kube-api-access-k9mdl\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.975488 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.983509 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.987445 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:24:32 crc kubenswrapper[4931]: I0130 05:24:32.992751 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:24:33 crc kubenswrapper[4931]: W0130 05:24:33.012641 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod081e3873_ea99_4486_925f_784a98e49405.slice/crio-9ce481797a1f7988304010979cd564d60b819812a50932395d5f66e51b07187f WatchSource:0}: Error finding container 9ce481797a1f7988304010979cd564d60b819812a50932395d5f66e51b07187f: 
Status 404 returned error can't find the container with id 9ce481797a1f7988304010979cd564d60b819812a50932395d5f66e51b07187f Jan 30 05:24:33 crc kubenswrapper[4931]: W0130 05:24:33.014876 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc3f4796_66b1_452b_afca_5e62cbf2a53b.slice/crio-29ea210245e2f099c89b0f1dd11f3b873ad2e58d0c37d026cc8cfb61ec6d3cfa WatchSource:0}: Error finding container 29ea210245e2f099c89b0f1dd11f3b873ad2e58d0c37d026cc8cfb61ec6d3cfa: Status 404 returned error can't find the container with id 29ea210245e2f099c89b0f1dd11f3b873ad2e58d0c37d026cc8cfb61ec6d3cfa Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.059825 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-thxc2"] Jan 30 05:24:33 crc kubenswrapper[4931]: W0130 05:24:33.065587 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5732e34e_6330_4a36_9082_dbb50eede9f2.slice/crio-f259a70451b1edb6023ad4c42bb1037e4e2cbc756eed3599105a7d0ba07dc5ac WatchSource:0}: Error finding container f259a70451b1edb6023ad4c42bb1037e4e2cbc756eed3599105a7d0ba07dc5ac: Status 404 returned error can't find the container with id f259a70451b1edb6023ad4c42bb1037e4e2cbc756eed3599105a7d0ba07dc5ac Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.117802 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.396842 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-thxc2" event={"ID":"5732e34e-6330-4a36-9082-dbb50eede9f2","Type":"ContainerStarted","Data":"f259a70451b1edb6023ad4c42bb1037e4e2cbc756eed3599105a7d0ba07dc5ac"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.398839 4931 generic.go:334] "Generic (PLEG): container finished" podID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerID="d84e00fba877076358074492301ade07a08806d5e1a8e09523a5b7de67a88279" exitCode=0 Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.398883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" event={"ID":"7358be48-7c82-45bd-8165-0c02dcdb3666","Type":"ContainerDied","Data":"d84e00fba877076358074492301ade07a08806d5e1a8e09523a5b7de67a88279"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.400525 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2","Type":"ContainerStarted","Data":"54dbd562a66dec0d4b1d17dc98e849f8bad3b54d165bc61c92a13695e75f4ae7"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.404469 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef","Type":"ContainerStarted","Data":"651858dcd740868b54f1818387952f7e3dd92b06537502abf826f277b0f1c2f7"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.407064 4931 generic.go:334] "Generic (PLEG): container finished" podID="75ff6901-d6c9-467a-a4d2-35ddb8050570" containerID="cf669d89126cd05876fe2026bdc44224135e63c9e8ec5899f87342a850974a32" exitCode=0 Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.407116 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx" 
event={"ID":"75ff6901-d6c9-467a-a4d2-35ddb8050570","Type":"ContainerDied","Data":"cf669d89126cd05876fe2026bdc44224135e63c9e8ec5899f87342a850974a32"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.408934 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-249jr" Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.408981 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-249jr" event={"ID":"942de512-1fdc-4955-a703-ccd872474993","Type":"ContainerDied","Data":"e64df3f8ede918e6ed1e054ec67159e4720f4b2da72fde95107c95f43f1ee3d0"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.413745 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba","Type":"ContainerStarted","Data":"04861bcc57b9390c9ad1874bbf632a1a5e0da259d664ad8c22e1c2db45c343a6"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.450947 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj" Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.457848 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"081e3873-ea99-4486-925f-784a98e49405","Type":"ContainerStarted","Data":"9ce481797a1f7988304010979cd564d60b819812a50932395d5f66e51b07187f"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.457908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc3f4796-66b1-452b-afca-5e62cbf2a53b","Type":"ContainerStarted","Data":"29ea210245e2f099c89b0f1dd11f3b873ad2e58d0c37d026cc8cfb61ec6d3cfa"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.457928 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-lfczj" event={"ID":"7f097854-006c-4110-bec8-b9d364ddb000","Type":"ContainerDied","Data":"bbc8e313c79f2552b4495d71bd60cf2000cede24398da8824f1260de69102011"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.467677 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ggjtl" event={"ID":"8a337463-8b7e-496b-9a01-fc491120c21d","Type":"ContainerStarted","Data":"c2331a0e3efc476073fa6f72907e46cdc0fd3358dd0c363648234586881ae09d"} Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.530308 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-249jr"] Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.535290 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-249jr"] Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.597486 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-lfczj"] Jan 30 05:24:33 crc kubenswrapper[4931]: I0130 05:24:33.602815 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-lfczj"] Jan 30 05:24:33 crc kubenswrapper[4931]: 
I0130 05:24:33.693010 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:24:33 crc kubenswrapper[4931]: W0130 05:24:33.734180 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf28f211b_be26_4f15_92a1_36b91cb53bbb.slice/crio-920213aded2a6124fc2a4c0ef0f31260bf1b62f8e7693371989b52df16882f74 WatchSource:0}: Error finding container 920213aded2a6124fc2a4c0ef0f31260bf1b62f8e7693371989b52df16882f74: Status 404 returned error can't find the container with id 920213aded2a6124fc2a4c0ef0f31260bf1b62f8e7693371989b52df16882f74 Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.076997 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.150099 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-dvktv"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.151281 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.154469 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.170583 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ntv\" (UniqueName: \"kubernetes.io/projected/4ba289fc-17e9-45e9-ac24-434d69045d97-kube-api-access-j6ntv\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.171171 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-combined-ca-bundle\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.171394 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.171568 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovn-rundir\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.171637 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba289fc-17e9-45e9-ac24-434d69045d97-config\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.171666 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovs-rundir\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.174844 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dvktv"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.272363 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovn-rundir\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.272433 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba289fc-17e9-45e9-ac24-434d69045d97-config\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.272455 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovs-rundir\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" 
Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.272494 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ntv\" (UniqueName: \"kubernetes.io/projected/4ba289fc-17e9-45e9-ac24-434d69045d97-kube-api-access-j6ntv\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.272522 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-combined-ca-bundle\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.272584 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.272712 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovn-rundir\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.273219 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba289fc-17e9-45e9-ac24-434d69045d97-config\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 
05:24:34.273546 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovs-rundir\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.280740 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.289040 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ntv\" (UniqueName: \"kubernetes.io/projected/4ba289fc-17e9-45e9-ac24-434d69045d97-kube-api-access-j6ntv\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.301015 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-combined-ca-bundle\") pod \"ovn-controller-metrics-dvktv\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.335012 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-tbgfx"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.340309 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-794868bd45-bzjkd"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.348577 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.351366 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.381093 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-bzjkd"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.452307 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9vtmt"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.468530 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.470076 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-v6tmx"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.475171 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.478600 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.483389 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-config\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.483533 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.483558 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv89g\" (UniqueName: \"kubernetes.io/projected/a34b87df-8978-4e2d-9875-a6b81a09fa84-kube-api-access-xv89g\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.483594 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-dns-svc\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.498507 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-757dc6fff9-v6tmx"] Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.508141 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" event={"ID":"7358be48-7c82-45bd-8165-0c02dcdb3666","Type":"ContainerStarted","Data":"4608c1f4973c251ac1bb4b51d3c1df7c1d0e5b8e2d0e5905c0643537c2687704"} Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.508450 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.519951 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f28f211b-be26-4f15-92a1-36b91cb53bbb","Type":"ContainerStarted","Data":"920213aded2a6124fc2a4c0ef0f31260bf1b62f8e7693371989b52df16882f74"} Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.527589 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" podStartSLOduration=7.907460806 podStartE2EDuration="16.527575623s" podCreationTimestamp="2026-01-30 05:24:18 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.622727358 +0000 UTC m=+998.992637625" lastFinishedPulling="2026-01-30 05:24:32.242842185 +0000 UTC m=+1007.612752442" observedRunningTime="2026-01-30 05:24:34.526248625 +0000 UTC m=+1009.896158882" watchObservedRunningTime="2026-01-30 05:24:34.527575623 +0000 UTC m=+1009.897485880" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.585588 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586295 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586351 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv89g\" (UniqueName: \"kubernetes.io/projected/a34b87df-8978-4e2d-9875-a6b81a09fa84-kube-api-access-xv89g\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586414 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-config\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586468 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-dns-svc\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586584 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpfk6\" (UniqueName: \"kubernetes.io/projected/c052a747-4d6e-459f-80c2-b690015e411d-kube-api-access-dpfk6\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586631 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586727 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-config\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.586774 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.588797 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-config\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.589478 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-dns-svc\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.589910 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.603795 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv89g\" (UniqueName: \"kubernetes.io/projected/a34b87df-8978-4e2d-9875-a6b81a09fa84-kube-api-access-xv89g\") pod \"dnsmasq-dns-794868bd45-bzjkd\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: E0130 05:24:34.645352 4931 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 30 05:24:34 crc kubenswrapper[4931]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/75ff6901-d6c9-467a-a4d2-35ddb8050570/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 05:24:34 crc kubenswrapper[4931]: > podSandboxID="d42db6c19bc5af9b7f0bda95ed922048afc3e58c472145aa59ba1e17a5184206" Jan 30 05:24:34 crc kubenswrapper[4931]: E0130 05:24:34.645574 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:24:34 crc kubenswrapper[4931]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkvn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-744ffd65bc-tbgfx_openstack(75ff6901-d6c9-467a-a4d2-35ddb8050570): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/75ff6901-d6c9-467a-a4d2-35ddb8050570/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 05:24:34 crc kubenswrapper[4931]: > logger="UnhandledError" Jan 30 05:24:34 crc kubenswrapper[4931]: E0130 05:24:34.646950 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/75ff6901-d6c9-467a-a4d2-35ddb8050570/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx" podUID="75ff6901-d6c9-467a-a4d2-35ddb8050570" Jan 30 05:24:34 crc kubenswrapper[4931]: W0130 05:24:34.649990 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a63fb4_24bc_4834_b6e7_937688c5de09.slice/crio-d92e6599441617be3a228c318ab5084a192dbb4da1df24b47362bc9f2366da37 WatchSource:0}: Error 
finding container d92e6599441617be3a228c318ab5084a192dbb4da1df24b47362bc9f2366da37: Status 404 returned error can't find the container with id d92e6599441617be3a228c318ab5084a192dbb4da1df24b47362bc9f2366da37 Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.688453 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.688530 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.688568 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-config\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.688595 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpfk6\" (UniqueName: \"kubernetes.io/projected/c052a747-4d6e-459f-80c2-b690015e411d-kube-api-access-dpfk6\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.688615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.689624 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.689684 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-config\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.689707 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.691437 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.693164 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.706371 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpfk6\" (UniqueName: \"kubernetes.io/projected/c052a747-4d6e-459f-80c2-b690015e411d-kube-api-access-dpfk6\") pod \"dnsmasq-dns-757dc6fff9-v6tmx\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:34 crc kubenswrapper[4931]: I0130 05:24:34.809834 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:35 crc kubenswrapper[4931]: I0130 05:24:35.434900 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f097854-006c-4110-bec8-b9d364ddb000" path="/var/lib/kubelet/pods/7f097854-006c-4110-bec8-b9d364ddb000/volumes" Jan 30 05:24:35 crc kubenswrapper[4931]: I0130 05:24:35.435329 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="942de512-1fdc-4955-a703-ccd872474993" path="/var/lib/kubelet/pods/942de512-1fdc-4955-a703-ccd872474993/volumes" Jan 30 05:24:35 crc kubenswrapper[4931]: I0130 05:24:35.530319 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"49a63fb4-24bc-4834-b6e7-937688c5de09","Type":"ContainerStarted","Data":"d92e6599441617be3a228c318ab5084a192dbb4da1df24b47362bc9f2366da37"} Jan 30 05:24:35 crc kubenswrapper[4931]: I0130 05:24:35.530440 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerName="dnsmasq-dns" containerID="cri-o://4608c1f4973c251ac1bb4b51d3c1df7c1d0e5b8e2d0e5905c0643537c2687704" gracePeriod=10 Jan 30 05:24:36 crc kubenswrapper[4931]: I0130 05:24:36.541882 4931 generic.go:334] "Generic (PLEG): container finished" podID="7358be48-7c82-45bd-8165-0c02dcdb3666" 
containerID="4608c1f4973c251ac1bb4b51d3c1df7c1d0e5b8e2d0e5905c0643537c2687704" exitCode=0 Jan 30 05:24:36 crc kubenswrapper[4931]: I0130 05:24:36.541936 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" event={"ID":"7358be48-7c82-45bd-8165-0c02dcdb3666","Type":"ContainerDied","Data":"4608c1f4973c251ac1bb4b51d3c1df7c1d0e5b8e2d0e5905c0643537c2687704"} Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.555658 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx" event={"ID":"75ff6901-d6c9-467a-a4d2-35ddb8050570","Type":"ContainerDied","Data":"d42db6c19bc5af9b7f0bda95ed922048afc3e58c472145aa59ba1e17a5184206"} Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.555866 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42db6c19bc5af9b7f0bda95ed922048afc3e58c472145aa59ba1e17a5184206" Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.589606 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx" Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.745116 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvn6\" (UniqueName: \"kubernetes.io/projected/75ff6901-d6c9-467a-a4d2-35ddb8050570-kube-api-access-wkvn6\") pod \"75ff6901-d6c9-467a-a4d2-35ddb8050570\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.745369 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-dns-svc\") pod \"75ff6901-d6c9-467a-a4d2-35ddb8050570\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.745485 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-config\") pod \"75ff6901-d6c9-467a-a4d2-35ddb8050570\" (UID: \"75ff6901-d6c9-467a-a4d2-35ddb8050570\") " Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.757664 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ff6901-d6c9-467a-a4d2-35ddb8050570-kube-api-access-wkvn6" (OuterVolumeSpecName: "kube-api-access-wkvn6") pod "75ff6901-d6c9-467a-a4d2-35ddb8050570" (UID: "75ff6901-d6c9-467a-a4d2-35ddb8050570"). InnerVolumeSpecName "kube-api-access-wkvn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.794360 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-config" (OuterVolumeSpecName: "config") pod "75ff6901-d6c9-467a-a4d2-35ddb8050570" (UID: "75ff6901-d6c9-467a-a4d2-35ddb8050570"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.796280 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75ff6901-d6c9-467a-a4d2-35ddb8050570" (UID: "75ff6901-d6c9-467a-a4d2-35ddb8050570"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.848232 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvn6\" (UniqueName: \"kubernetes.io/projected/75ff6901-d6c9-467a-a4d2-35ddb8050570-kube-api-access-wkvn6\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.848305 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:37 crc kubenswrapper[4931]: I0130 05:24:37.848315 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ff6901-d6c9-467a-a4d2-35ddb8050570-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:38 crc kubenswrapper[4931]: I0130 05:24:38.569977 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-tbgfx" Jan 30 05:24:38 crc kubenswrapper[4931]: I0130 05:24:38.631774 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-tbgfx"] Jan 30 05:24:38 crc kubenswrapper[4931]: I0130 05:24:38.637168 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-tbgfx"] Jan 30 05:24:39 crc kubenswrapper[4931]: I0130 05:24:39.433907 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ff6901-d6c9-467a-a4d2-35ddb8050570" path="/var/lib/kubelet/pods/75ff6901-d6c9-467a-a4d2-35ddb8050570/volumes" Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.668017 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.802332 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knxhw\" (UniqueName: \"kubernetes.io/projected/7358be48-7c82-45bd-8165-0c02dcdb3666-kube-api-access-knxhw\") pod \"7358be48-7c82-45bd-8165-0c02dcdb3666\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.802481 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-config\") pod \"7358be48-7c82-45bd-8165-0c02dcdb3666\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.802586 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-dns-svc\") pod \"7358be48-7c82-45bd-8165-0c02dcdb3666\" (UID: \"7358be48-7c82-45bd-8165-0c02dcdb3666\") " Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.808938 4931 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7358be48-7c82-45bd-8165-0c02dcdb3666-kube-api-access-knxhw" (OuterVolumeSpecName: "kube-api-access-knxhw") pod "7358be48-7c82-45bd-8165-0c02dcdb3666" (UID: "7358be48-7c82-45bd-8165-0c02dcdb3666"). InnerVolumeSpecName "kube-api-access-knxhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.855564 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-config" (OuterVolumeSpecName: "config") pod "7358be48-7c82-45bd-8165-0c02dcdb3666" (UID: "7358be48-7c82-45bd-8165-0c02dcdb3666"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.859383 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7358be48-7c82-45bd-8165-0c02dcdb3666" (UID: "7358be48-7c82-45bd-8165-0c02dcdb3666"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.904033 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knxhw\" (UniqueName: \"kubernetes.io/projected/7358be48-7c82-45bd-8165-0c02dcdb3666-kube-api-access-knxhw\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.904071 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:40 crc kubenswrapper[4931]: I0130 05:24:40.904081 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7358be48-7c82-45bd-8165-0c02dcdb3666-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:41 crc kubenswrapper[4931]: I0130 05:24:41.608395 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" event={"ID":"7358be48-7c82-45bd-8165-0c02dcdb3666","Type":"ContainerDied","Data":"16572f431ff9e8b9359ca1361817097bd7aaff980c134f42bc30a47f2e623bfc"} Jan 30 05:24:41 crc kubenswrapper[4931]: I0130 05:24:41.608537 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" Jan 30 05:24:41 crc kubenswrapper[4931]: I0130 05:24:41.608556 4931 scope.go:117] "RemoveContainer" containerID="4608c1f4973c251ac1bb4b51d3c1df7c1d0e5b8e2d0e5905c0643537c2687704" Jan 30 05:24:41 crc kubenswrapper[4931]: I0130 05:24:41.790133 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9vtmt"] Jan 30 05:24:41 crc kubenswrapper[4931]: I0130 05:24:41.798809 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9vtmt"] Jan 30 05:24:41 crc kubenswrapper[4931]: I0130 05:24:41.826064 4931 scope.go:117] "RemoveContainer" containerID="d84e00fba877076358074492301ade07a08806d5e1a8e09523a5b7de67a88279" Jan 30 05:24:42 crc kubenswrapper[4931]: I0130 05:24:42.669518 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dvktv"] Jan 30 05:24:42 crc kubenswrapper[4931]: I0130 05:24:42.898109 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-bzjkd"] Jan 30 05:24:42 crc kubenswrapper[4931]: I0130 05:24:42.904689 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-v6tmx"] Jan 30 05:24:43 crc kubenswrapper[4931]: W0130 05:24:43.265382 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda34b87df_8978_4e2d_9875_a6b81a09fa84.slice/crio-d1724450fd332ef12f3dd92f8b86fd94720600e08c372a71b460370b61dc3ade WatchSource:0}: Error finding container d1724450fd332ef12f3dd92f8b86fd94720600e08c372a71b460370b61dc3ade: Status 404 returned error can't find the container with id d1724450fd332ef12f3dd92f8b86fd94720600e08c372a71b460370b61dc3ade Jan 30 05:24:43 crc kubenswrapper[4931]: I0130 05:24:43.457995 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" 
path="/var/lib/kubelet/pods/7358be48-7c82-45bd-8165-0c02dcdb3666/volumes" Jan 30 05:24:43 crc kubenswrapper[4931]: I0130 05:24:43.650930 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dvktv" event={"ID":"4ba289fc-17e9-45e9-ac24-434d69045d97","Type":"ContainerStarted","Data":"39a86ec198f21c9ed97c5b274927fc46f2f6f56ea606ee080f8268afe4d2241b"} Jan 30 05:24:43 crc kubenswrapper[4931]: I0130 05:24:43.653974 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" event={"ID":"c052a747-4d6e-459f-80c2-b690015e411d","Type":"ContainerStarted","Data":"d8ae5c6c06a93c29197bfde41e6a215859930a15dc388d2269865aa48021ba9a"} Jan 30 05:24:43 crc kubenswrapper[4931]: I0130 05:24:43.655142 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" event={"ID":"a34b87df-8978-4e2d-9875-a6b81a09fa84","Type":"ContainerStarted","Data":"d1724450fd332ef12f3dd92f8b86fd94720600e08c372a71b460370b61dc3ade"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.226284 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-95f5f6995-9vtmt" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: i/o timeout" Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.669057 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"348ffd7a-9b7f-40aa-ada9-145a3a783d09","Type":"ContainerStarted","Data":"8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.678130 4931 generic.go:334] "Generic (PLEG): container finished" podID="c052a747-4d6e-459f-80c2-b690015e411d" containerID="2951358824ae5ca54f437c7afd5ea7478602f9317a7330914d36e2cd66c684f6" exitCode=0 Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.678762 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" event={"ID":"c052a747-4d6e-459f-80c2-b690015e411d","Type":"ContainerDied","Data":"2951358824ae5ca54f437c7afd5ea7478602f9317a7330914d36e2cd66c684f6"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.682620 4931 generic.go:334] "Generic (PLEG): container finished" podID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerID="f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054" exitCode=0 Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.682686 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-thxc2" event={"ID":"5732e34e-6330-4a36-9082-dbb50eede9f2","Type":"ContainerDied","Data":"f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.686178 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f28f211b-be26-4f15-92a1-36b91cb53bbb","Type":"ContainerStarted","Data":"8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.708967 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba","Type":"ContainerStarted","Data":"d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.709123 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.710708 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"081e3873-ea99-4486-925f-784a98e49405","Type":"ContainerStarted","Data":"4db4fc560f1e0be65146b56bdc4340b3ff1c5a4fe7510a353795090f99291213"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.711913 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2","Type":"ContainerStarted","Data":"b4e72393e02c3e0619fa1e4bea6d0742ffbef4de8775e7d96ccdc8545af19acb"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.711981 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.712977 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef","Type":"ContainerStarted","Data":"c1a9c2221cdd695e864a018795817226f2731d9a93a0ec46938ca160bd878ce0"} Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.786316 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.485921152 podStartE2EDuration="21.786297957s" podCreationTimestamp="2026-01-30 05:24:23 +0000 UTC" firstStartedPulling="2026-01-30 05:24:32.958193806 +0000 UTC m=+1008.328104063" lastFinishedPulling="2026-01-30 05:24:41.258570571 +0000 UTC m=+1016.628480868" observedRunningTime="2026-01-30 05:24:44.775786052 +0000 UTC m=+1020.145696329" watchObservedRunningTime="2026-01-30 05:24:44.786297957 +0000 UTC m=+1020.156208214" Jan 30 05:24:44 crc kubenswrapper[4931]: I0130 05:24:44.819860 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.403326916 podStartE2EDuration="20.819837408s" podCreationTimestamp="2026-01-30 05:24:24 +0000 UTC" firstStartedPulling="2026-01-30 05:24:32.957881127 +0000 UTC m=+1008.327791384" lastFinishedPulling="2026-01-30 05:24:43.374391609 +0000 UTC m=+1018.744301876" observedRunningTime="2026-01-30 05:24:44.814097977 +0000 UTC m=+1020.184008234" watchObservedRunningTime="2026-01-30 05:24:44.819837408 +0000 UTC m=+1020.189747665" Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.725737 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" event={"ID":"c052a747-4d6e-459f-80c2-b690015e411d","Type":"ContainerStarted","Data":"3eb355680179efbcbb2cf73e83f9b34f38755a348dc73a0a8db4b58a9c1de2f1"} Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.726692 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.727828 4931 generic.go:334] "Generic (PLEG): container finished" podID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerID="b4197288cd21de4e055a7299affb1cb43a53991838539d7286331173ba92c743" exitCode=0 Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.727883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" event={"ID":"a34b87df-8978-4e2d-9875-a6b81a09fa84","Type":"ContainerDied","Data":"b4197288cd21de4e055a7299affb1cb43a53991838539d7286331173ba92c743"} Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.730017 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"49a63fb4-24bc-4834-b6e7-937688c5de09","Type":"ContainerStarted","Data":"ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21"} Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.735663 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dvktv" event={"ID":"4ba289fc-17e9-45e9-ac24-434d69045d97","Type":"ContainerStarted","Data":"82c70d68aab65fc3db72ee184a048732b17b72a09f49232810d0c430a261f1e7"} Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.739548 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f28f211b-be26-4f15-92a1-36b91cb53bbb","Type":"ContainerStarted","Data":"4583c4efc80289000d7023b793d84dc55442d51907c690cc558e927738cb2e88"} Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.742000 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ggjtl" event={"ID":"8a337463-8b7e-496b-9a01-fc491120c21d","Type":"ContainerStarted","Data":"324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9"} Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.775842 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" podStartSLOduration=11.775808112 podStartE2EDuration="11.775808112s" podCreationTimestamp="2026-01-30 05:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:24:45.755846282 +0000 UTC m=+1021.125756579" watchObservedRunningTime="2026-01-30 05:24:45.775808112 +0000 UTC m=+1021.145718399" Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.791091 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-dvktv" podStartSLOduration=9.907045496 podStartE2EDuration="11.79105719s" podCreationTimestamp="2026-01-30 05:24:34 +0000 UTC" firstStartedPulling="2026-01-30 05:24:42.764034314 +0000 UTC m=+1018.133944571" lastFinishedPulling="2026-01-30 05:24:44.648046008 +0000 UTC m=+1020.017956265" observedRunningTime="2026-01-30 05:24:45.790099023 +0000 UTC m=+1021.160009290" watchObservedRunningTime="2026-01-30 05:24:45.79105719 +0000 UTC m=+1021.160967457" Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.908220 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ggjtl" podStartSLOduration=7.61731316 podStartE2EDuration="17.908202247s" podCreationTimestamp="2026-01-30 05:24:28 +0000 UTC" firstStartedPulling="2026-01-30 05:24:32.956861429 +0000 UTC m=+1008.326771686" lastFinishedPulling="2026-01-30 05:24:43.247750486 +0000 UTC m=+1018.617660773" observedRunningTime="2026-01-30 05:24:45.905180703 +0000 UTC m=+1021.275090980" watchObservedRunningTime="2026-01-30 05:24:45.908202247 +0000 UTC 
m=+1021.278112504" Jan 30 05:24:45 crc kubenswrapper[4931]: I0130 05:24:45.927806 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.025953387 podStartE2EDuration="14.927789897s" podCreationTimestamp="2026-01-30 05:24:31 +0000 UTC" firstStartedPulling="2026-01-30 05:24:33.741434554 +0000 UTC m=+1009.111344811" lastFinishedPulling="2026-01-30 05:24:44.643271064 +0000 UTC m=+1020.013181321" observedRunningTime="2026-01-30 05:24:45.924722691 +0000 UTC m=+1021.294632948" watchObservedRunningTime="2026-01-30 05:24:45.927789897 +0000 UTC m=+1021.297700154" Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.756417 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc3f4796-66b1-452b-afca-5e62cbf2a53b","Type":"ContainerStarted","Data":"8bdcfbd624616a917de046867a4b176539e978b80dd1b9fad737dfdab9cb1bce"} Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.760364 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" event={"ID":"a34b87df-8978-4e2d-9875-a6b81a09fa84","Type":"ContainerStarted","Data":"9d327fe89d71a738231f8f91b804639f88053bf30faeeb4596482e0db97b8f02"} Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.760546 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.764026 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"49a63fb4-24bc-4834-b6e7-937688c5de09","Type":"ContainerStarted","Data":"36f1f59d90f7e1367de837bd2375e2c11d0df21e5687e4d77b474faff3e8df0b"} Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.768611 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-thxc2" 
event={"ID":"5732e34e-6330-4a36-9082-dbb50eede9f2","Type":"ContainerStarted","Data":"52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17"} Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.768683 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-thxc2" event={"ID":"5732e34e-6330-4a36-9082-dbb50eede9f2","Type":"ContainerStarted","Data":"ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1"} Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.769466 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ggjtl" Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.842074 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" podStartSLOduration=12.8420398 podStartE2EDuration="12.8420398s" podCreationTimestamp="2026-01-30 05:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:24:46.836868614 +0000 UTC m=+1022.206778911" watchObservedRunningTime="2026-01-30 05:24:46.8420398 +0000 UTC m=+1022.211950097" Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.881401 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-thxc2" podStartSLOduration=10.690363848 podStartE2EDuration="18.881361503s" podCreationTimestamp="2026-01-30 05:24:28 +0000 UTC" firstStartedPulling="2026-01-30 05:24:33.067515874 +0000 UTC m=+1008.437426131" lastFinishedPulling="2026-01-30 05:24:41.258513529 +0000 UTC m=+1016.628423786" observedRunningTime="2026-01-30 05:24:46.863398129 +0000 UTC m=+1022.233308426" watchObservedRunningTime="2026-01-30 05:24:46.881361503 +0000 UTC m=+1022.251271800" Jan 30 05:24:46 crc kubenswrapper[4931]: I0130 05:24:46.907952 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" 
podStartSLOduration=10.304373317 podStartE2EDuration="18.907914858s" podCreationTimestamp="2026-01-30 05:24:28 +0000 UTC" firstStartedPulling="2026-01-30 05:24:34.652475247 +0000 UTC m=+1010.022385504" lastFinishedPulling="2026-01-30 05:24:43.256016788 +0000 UTC m=+1018.625927045" observedRunningTime="2026-01-30 05:24:46.904185383 +0000 UTC m=+1022.274095700" watchObservedRunningTime="2026-01-30 05:24:46.907914858 +0000 UTC m=+1022.277825165"
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.780218 4931 generic.go:334] "Generic (PLEG): container finished" podID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerID="8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07" exitCode=0
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.780346 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"348ffd7a-9b7f-40aa-ada9-145a3a783d09","Type":"ContainerDied","Data":"8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07"}
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.785777 4931 generic.go:334] "Generic (PLEG): container finished" podID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerID="c1a9c2221cdd695e864a018795817226f2731d9a93a0ec46938ca160bd878ce0" exitCode=0
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.785922 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef","Type":"ContainerDied","Data":"c1a9c2221cdd695e864a018795817226f2731d9a93a0ec46938ca160bd878ce0"}
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.787190 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.787460 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-thxc2"
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.810172 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:47 crc kubenswrapper[4931]: I0130 05:24:47.908511 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.118799 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.119068 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.187638 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.479262 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.794900 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"348ffd7a-9b7f-40aa-ada9-145a3a783d09","Type":"ContainerStarted","Data":"2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257"}
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.797302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef","Type":"ContainerStarted","Data":"1c39e215a63df32503630d9061f8096755ed069b7c32eec93d18140193ca977b"}
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.797810 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.834243 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.497309281 podStartE2EDuration="28.83421763s" podCreationTimestamp="2026-01-30 05:24:20 +0000 UTC" firstStartedPulling="2026-01-30 05:24:32.122159468 +0000 UTC m=+1007.492069725" lastFinishedPulling="2026-01-30 05:24:41.459067817 +0000 UTC m=+1016.828978074" observedRunningTime="2026-01-30 05:24:48.821644257 +0000 UTC m=+1024.191554514" watchObservedRunningTime="2026-01-30 05:24:48.83421763 +0000 UTC m=+1024.204127897"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.853312 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.035221224 podStartE2EDuration="27.853289685s" podCreationTimestamp="2026-01-30 05:24:21 +0000 UTC" firstStartedPulling="2026-01-30 05:24:33.001679446 +0000 UTC m=+1008.371589703" lastFinishedPulling="2026-01-30 05:24:41.819747917 +0000 UTC m=+1017.189658164" observedRunningTime="2026-01-30 05:24:48.849530309 +0000 UTC m=+1024.219440616" watchObservedRunningTime="2026-01-30 05:24:48.853289685 +0000 UTC m=+1024.223199952"
Jan 30 05:24:48 crc kubenswrapper[4931]: I0130 05:24:48.860833 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:49 crc kubenswrapper[4931]: I0130 05:24:49.867788 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.045718 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 05:24:50 crc kubenswrapper[4931]: E0130 05:24:50.046002 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerName="init"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.046019 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerName="init"
Jan 30 05:24:50 crc kubenswrapper[4931]: E0130 05:24:50.046042 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ff6901-d6c9-467a-a4d2-35ddb8050570" containerName="init"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.046048 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ff6901-d6c9-467a-a4d2-35ddb8050570" containerName="init"
Jan 30 05:24:50 crc kubenswrapper[4931]: E0130 05:24:50.046077 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerName="dnsmasq-dns"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.046083 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerName="dnsmasq-dns"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.046215 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ff6901-d6c9-467a-a4d2-35ddb8050570" containerName="init"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.046231 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7358be48-7c82-45bd-8165-0c02dcdb3666" containerName="dnsmasq-dns"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.046950 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.048895 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.049039 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7t2b7"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.049125 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.049409 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.066684 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.167790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.168019 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.168067 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.168134 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-scripts\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.168161 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-config\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.168276 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb59q\" (UniqueName: \"kubernetes.io/projected/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-kube-api-access-qb59q\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.168325 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.269760 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.269807 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.269835 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-scripts\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.269852 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-config\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.269877 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb59q\" (UniqueName: \"kubernetes.io/projected/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-kube-api-access-qb59q\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.269895 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.270203 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.270287 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.270834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-config\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.271306 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-scripts\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.275603 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.275722 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.278069 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.289660 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb59q\" (UniqueName: \"kubernetes.io/projected/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-kube-api-access-qb59q\") pod \"ovn-northd-0\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.406393 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 30 05:24:50 crc kubenswrapper[4931]: E0130 05:24:50.719518 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.179:36928->38.102.83.179:45103: write tcp 38.102.83.179:36928->38.102.83.179:45103: write: broken pipe
Jan 30 05:24:50 crc kubenswrapper[4931]: I0130 05:24:50.896847 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 05:24:51 crc kubenswrapper[4931]: I0130 05:24:51.715393 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 30 05:24:51 crc kubenswrapper[4931]: I0130 05:24:51.715939 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 30 05:24:51 crc kubenswrapper[4931]: I0130 05:24:51.840122 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91","Type":"ContainerStarted","Data":"14085ad5b1fb30e2e472a98f1ef3cb304ab6fa42857d4a5f5e235f581937f71b"}
Jan 30 05:24:52 crc kubenswrapper[4931]: I0130 05:24:52.851621 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91","Type":"ContainerStarted","Data":"dea51d6ee685a2470eaa0864347990ea744cedc85d00846568c67d56ba221ee1"}
Jan 30 05:24:52 crc kubenswrapper[4931]: I0130 05:24:52.852140 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91","Type":"ContainerStarted","Data":"cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949"}
Jan 30 05:24:52 crc kubenswrapper[4931]: I0130 05:24:52.852165 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 30 05:24:52 crc kubenswrapper[4931]: I0130 05:24:52.890749 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.909024077 podStartE2EDuration="2.890717783s" podCreationTimestamp="2026-01-30 05:24:50 +0000 UTC" firstStartedPulling="2026-01-30 05:24:50.926175558 +0000 UTC m=+1026.296085825" lastFinishedPulling="2026-01-30 05:24:51.907869234 +0000 UTC m=+1027.277779531" observedRunningTime="2026-01-30 05:24:52.874717464 +0000 UTC m=+1028.244627761" watchObservedRunningTime="2026-01-30 05:24:52.890717783 +0000 UTC m=+1028.260628080"
Jan 30 05:24:53 crc kubenswrapper[4931]: I0130 05:24:53.162184 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:53 crc kubenswrapper[4931]: I0130 05:24:53.162262 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:53 crc kubenswrapper[4931]: I0130 05:24:53.285832 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:53 crc kubenswrapper[4931]: I0130 05:24:53.965954 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:24:54 crc kubenswrapper[4931]: I0130 05:24:54.694680 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-794868bd45-bzjkd"
Jan 30 05:24:54 crc kubenswrapper[4931]: I0130 05:24:54.811640 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx"
Jan 30 05:24:54 crc kubenswrapper[4931]: I0130 05:24:54.886991 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-bzjkd"]
Jan 30 05:24:54 crc kubenswrapper[4931]: I0130 05:24:54.896243 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" podUID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerName="dnsmasq-dns" containerID="cri-o://9d327fe89d71a738231f8f91b804639f88053bf30faeeb4596482e0db97b8f02" gracePeriod=10
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.271645 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-5hp4b"]
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.272842 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.288564 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-5hp4b"]
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.366895 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.366966 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.367012 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.367034 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-config\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.367056 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pjqc\" (UniqueName: \"kubernetes.io/projected/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-kube-api-access-4pjqc\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.387529 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.468580 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-config\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.468626 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pjqc\" (UniqueName: \"kubernetes.io/projected/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-kube-api-access-4pjqc\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.468695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.468735 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.468784 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.469514 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-config\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.469563 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.469641 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.469757 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.489096 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pjqc\" (UniqueName: \"kubernetes.io/projected/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-kube-api-access-4pjqc\") pod \"dnsmasq-dns-6cb545bd4c-5hp4b\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") " pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.590432 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.820918 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-5hp4b"]
Jan 30 05:24:55 crc kubenswrapper[4931]: W0130 05:24:55.823146 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb20eb5e_4f22_4088_98dc_44eaf5ac5958.slice/crio-683aec62918f40b319d4b21f6811c3625fb69dac85c6d8a9170f3b1e7160bffa WatchSource:0}: Error finding container 683aec62918f40b319d4b21f6811c3625fb69dac85c6d8a9170f3b1e7160bffa: Status 404 returned error can't find the container with id 683aec62918f40b319d4b21f6811c3625fb69dac85c6d8a9170f3b1e7160bffa
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.906198 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" event={"ID":"eb20eb5e-4f22-4088-98dc-44eaf5ac5958","Type":"ContainerStarted","Data":"683aec62918f40b319d4b21f6811c3625fb69dac85c6d8a9170f3b1e7160bffa"}
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.908927 4931 generic.go:334] "Generic (PLEG): container finished" podID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerID="9d327fe89d71a738231f8f91b804639f88053bf30faeeb4596482e0db97b8f02" exitCode=0
Jan 30 05:24:55 crc kubenswrapper[4931]: I0130 05:24:55.908973 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" event={"ID":"a34b87df-8978-4e2d-9875-a6b81a09fa84","Type":"ContainerDied","Data":"9d327fe89d71a738231f8f91b804639f88053bf30faeeb4596482e0db97b8f02"}
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.224075 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.303701 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.384130 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.430055 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.430342 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.440145 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.440283 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.440539 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.440673 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-nmd2f"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.484080 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.484138 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-cache\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.484168 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52577244-c181-4919-b5b0-040e229163db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.484241 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-lock\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.484297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.484344 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56w5l\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-kube-api-access-56w5l\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.586216 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.586291 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-cache\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.586335 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52577244-c181-4919-b5b0-040e229163db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.586440 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-lock\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: E0130 05:24:56.586462 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 30 05:24:56 crc kubenswrapper[4931]: E0130 05:24:56.586492 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.586525 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: E0130 05:24:56.586553 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift podName:52577244-c181-4919-b5b0-040e229163db nodeName:}" failed. No retries permitted until 2026-01-30 05:24:57.086530636 +0000 UTC m=+1032.456440903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift") pod "swift-storage-0" (UID: "52577244-c181-4919-b5b0-040e229163db") : configmap "swift-ring-files" not found
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.586575 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56w5l\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-kube-api-access-56w5l\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.587051 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.587205 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-lock\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.588007 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-cache\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.593617 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52577244-c181-4919-b5b0-040e229163db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.611414 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56w5l\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-kube-api-access-56w5l\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.631802 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.879930 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bcdcb"]
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.881223 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bcdcb"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.886250 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.886938 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.888446 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.909455 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bcdcb"]
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.993091 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-dispersionconf\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.993147 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-ring-data-devices\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.993171 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b9ebe73-0201-4486-9de9-e8828e84de53-etc-swift\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb"
Jan 30 05:24:56 crc kubenswrapper[4931]: I0130
05:24:56.993217 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-combined-ca-bundle\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.993493 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-scripts\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.993556 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-swiftconf\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:56 crc kubenswrapper[4931]: I0130 05:24:56.993596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xkk\" (UniqueName: \"kubernetes.io/projected/9b9ebe73-0201-4486-9de9-e8828e84de53-kube-api-access-88xkk\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.094623 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-dispersionconf\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 
05:24:57.094688 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-ring-data-devices\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.094711 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b9ebe73-0201-4486-9de9-e8828e84de53-etc-swift\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.094749 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-combined-ca-bundle\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.094784 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.094815 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-scripts\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.094839 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-swiftconf\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.094859 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xkk\" (UniqueName: \"kubernetes.io/projected/9b9ebe73-0201-4486-9de9-e8828e84de53-kube-api-access-88xkk\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.095255 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b9ebe73-0201-4486-9de9-e8828e84de53-etc-swift\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: E0130 05:24:57.095284 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:24:57 crc kubenswrapper[4931]: E0130 05:24:57.095308 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.095675 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-scripts\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.095785 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-ring-data-devices\") pod 
\"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: E0130 05:24:57.095848 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift podName:52577244-c181-4919-b5b0-040e229163db nodeName:}" failed. No retries permitted until 2026-01-30 05:24:58.095833577 +0000 UTC m=+1033.465743834 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift") pod "swift-storage-0" (UID: "52577244-c181-4919-b5b0-040e229163db") : configmap "swift-ring-files" not found Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.098229 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-dispersionconf\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.098716 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-swiftconf\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.099471 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-combined-ca-bundle\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.122142 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-88xkk\" (UniqueName: \"kubernetes.io/projected/9b9ebe73-0201-4486-9de9-e8828e84de53-kube-api-access-88xkk\") pod \"swift-ring-rebalance-bcdcb\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.207240 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.759562 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bcdcb"] Jan 30 05:24:57 crc kubenswrapper[4931]: W0130 05:24:57.763624 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b9ebe73_0201_4486_9de9_e8828e84de53.slice/crio-e0aeb9238b4a2a03f01dbc9b575b6093a31f0d939b0c8c92582fa2fe6528a320 WatchSource:0}: Error finding container e0aeb9238b4a2a03f01dbc9b575b6093a31f0d939b0c8c92582fa2fe6528a320: Status 404 returned error can't find the container with id e0aeb9238b4a2a03f01dbc9b575b6093a31f0d939b0c8c92582fa2fe6528a320 Jan 30 05:24:57 crc kubenswrapper[4931]: I0130 05:24:57.924195 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bcdcb" event={"ID":"9b9ebe73-0201-4486-9de9-e8828e84de53","Type":"ContainerStarted","Data":"e0aeb9238b4a2a03f01dbc9b575b6093a31f0d939b0c8c92582fa2fe6528a320"} Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.114385 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0" Jan 30 05:24:58 crc kubenswrapper[4931]: E0130 05:24:58.114622 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 
05:24:58 crc kubenswrapper[4931]: E0130 05:24:58.114797 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:24:58 crc kubenswrapper[4931]: E0130 05:24:58.114857 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift podName:52577244-c181-4919-b5b0-040e229163db nodeName:}" failed. No retries permitted until 2026-01-30 05:25:00.114840489 +0000 UTC m=+1035.484750746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift") pod "swift-storage-0" (UID: "52577244-c181-4919-b5b0-040e229163db") : configmap "swift-ring-files" not found Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.748785 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-df05-account-create-update-xmzpk"] Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.750027 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.752270 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.780319 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-df05-account-create-update-xmzpk"] Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.793168 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9z9pd"] Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.794623 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.802402 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9z9pd"] Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.826182 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k69lh\" (UniqueName: \"kubernetes.io/projected/bf1b1f6c-2147-48f7-87ea-e64672036831-kube-api-access-k69lh\") pod \"glance-df05-account-create-update-xmzpk\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.826232 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvbc\" (UniqueName: \"kubernetes.io/projected/6ee75b9c-df74-490e-94ff-21eacce0b65a-kube-api-access-snvbc\") pod \"glance-db-create-9z9pd\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.826286 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee75b9c-df74-490e-94ff-21eacce0b65a-operator-scripts\") pod \"glance-db-create-9z9pd\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.826321 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1b1f6c-2147-48f7-87ea-e64672036831-operator-scripts\") pod \"glance-df05-account-create-update-xmzpk\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.932624 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1b1f6c-2147-48f7-87ea-e64672036831-operator-scripts\") pod \"glance-df05-account-create-update-xmzpk\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.932769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k69lh\" (UniqueName: \"kubernetes.io/projected/bf1b1f6c-2147-48f7-87ea-e64672036831-kube-api-access-k69lh\") pod \"glance-df05-account-create-update-xmzpk\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.932791 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snvbc\" (UniqueName: \"kubernetes.io/projected/6ee75b9c-df74-490e-94ff-21eacce0b65a-kube-api-access-snvbc\") pod \"glance-db-create-9z9pd\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.932826 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee75b9c-df74-490e-94ff-21eacce0b65a-operator-scripts\") pod \"glance-db-create-9z9pd\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.933922 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1b1f6c-2147-48f7-87ea-e64672036831-operator-scripts\") pod \"glance-df05-account-create-update-xmzpk\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 
05:24:58.934851 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee75b9c-df74-490e-94ff-21eacce0b65a-operator-scripts\") pod \"glance-db-create-9z9pd\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.960254 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvbc\" (UniqueName: \"kubernetes.io/projected/6ee75b9c-df74-490e-94ff-21eacce0b65a-kube-api-access-snvbc\") pod \"glance-db-create-9z9pd\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:58 crc kubenswrapper[4931]: I0130 05:24:58.962013 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k69lh\" (UniqueName: \"kubernetes.io/projected/bf1b1f6c-2147-48f7-87ea-e64672036831-kube-api-access-k69lh\") pod \"glance-df05-account-create-update-xmzpk\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.077187 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.120925 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9z9pd" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.314669 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.445005 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv89g\" (UniqueName: \"kubernetes.io/projected/a34b87df-8978-4e2d-9875-a6b81a09fa84-kube-api-access-xv89g\") pod \"a34b87df-8978-4e2d-9875-a6b81a09fa84\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.445387 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-config\") pod \"a34b87df-8978-4e2d-9875-a6b81a09fa84\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.445457 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-ovsdbserver-sb\") pod \"a34b87df-8978-4e2d-9875-a6b81a09fa84\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.445540 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-dns-svc\") pod \"a34b87df-8978-4e2d-9875-a6b81a09fa84\" (UID: \"a34b87df-8978-4e2d-9875-a6b81a09fa84\") " Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.450161 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34b87df-8978-4e2d-9875-a6b81a09fa84-kube-api-access-xv89g" (OuterVolumeSpecName: "kube-api-access-xv89g") pod "a34b87df-8978-4e2d-9875-a6b81a09fa84" (UID: "a34b87df-8978-4e2d-9875-a6b81a09fa84"). InnerVolumeSpecName "kube-api-access-xv89g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.484747 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-config" (OuterVolumeSpecName: "config") pod "a34b87df-8978-4e2d-9875-a6b81a09fa84" (UID: "a34b87df-8978-4e2d-9875-a6b81a09fa84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.486890 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a34b87df-8978-4e2d-9875-a6b81a09fa84" (UID: "a34b87df-8978-4e2d-9875-a6b81a09fa84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.492120 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a34b87df-8978-4e2d-9875-a6b81a09fa84" (UID: "a34b87df-8978-4e2d-9875-a6b81a09fa84"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.547186 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.547220 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv89g\" (UniqueName: \"kubernetes.io/projected/a34b87df-8978-4e2d-9875-a6b81a09fa84-kube-api-access-xv89g\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.547230 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.547238 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a34b87df-8978-4e2d-9875-a6b81a09fa84-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.550940 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-df05-account-create-update-xmzpk"] Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.641492 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9z9pd"] Jan 30 05:24:59 crc kubenswrapper[4931]: W0130 05:24:59.650556 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ee75b9c_df74_490e_94ff_21eacce0b65a.slice/crio-9419c3c276f323b6a29f149a011311b1d434b4549c0a4bf9eff6ed75a0a8a514 WatchSource:0}: Error finding container 9419c3c276f323b6a29f149a011311b1d434b4549c0a4bf9eff6ed75a0a8a514: Status 404 returned error can't find the container with id 9419c3c276f323b6a29f149a011311b1d434b4549c0a4bf9eff6ed75a0a8a514 Jan 30 
05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.946761 4931 generic.go:334] "Generic (PLEG): container finished" podID="bf1b1f6c-2147-48f7-87ea-e64672036831" containerID="b62af9a31208f4045d6ab5fc627a9d3f9b63bc460555779073074656653065f9" exitCode=0 Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.946840 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df05-account-create-update-xmzpk" event={"ID":"bf1b1f6c-2147-48f7-87ea-e64672036831","Type":"ContainerDied","Data":"b62af9a31208f4045d6ab5fc627a9d3f9b63bc460555779073074656653065f9"} Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.946871 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df05-account-create-update-xmzpk" event={"ID":"bf1b1f6c-2147-48f7-87ea-e64672036831","Type":"ContainerStarted","Data":"03bab058fa21df89a2e2e3b3b9b06339747851c18634d63060a8d6a53301dcfa"} Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.949021 4931 generic.go:334] "Generic (PLEG): container finished" podID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerID="5b51c3e6a6e67206beccccc2be017d2e75bb1a8386fa12f6af6b641475f06048" exitCode=0 Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.949355 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" event={"ID":"eb20eb5e-4f22-4088-98dc-44eaf5ac5958","Type":"ContainerDied","Data":"5b51c3e6a6e67206beccccc2be017d2e75bb1a8386fa12f6af6b641475f06048"} Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.964717 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" event={"ID":"a34b87df-8978-4e2d-9875-a6b81a09fa84","Type":"ContainerDied","Data":"d1724450fd332ef12f3dd92f8b86fd94720600e08c372a71b460370b61dc3ade"} Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.964972 4931 scope.go:117] "RemoveContainer" containerID="9d327fe89d71a738231f8f91b804639f88053bf30faeeb4596482e0db97b8f02" Jan 30 05:24:59 crc kubenswrapper[4931]: 
I0130 05:24:59.965073 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-bzjkd" Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.980357 4931 generic.go:334] "Generic (PLEG): container finished" podID="6ee75b9c-df74-490e-94ff-21eacce0b65a" containerID="b7fd522240b80788d80f7919145a4aa75ecf42cdb18b9fd6434f7a190f674261" exitCode=0 Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.980388 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9z9pd" event={"ID":"6ee75b9c-df74-490e-94ff-21eacce0b65a","Type":"ContainerDied","Data":"b7fd522240b80788d80f7919145a4aa75ecf42cdb18b9fd6434f7a190f674261"} Jan 30 05:24:59 crc kubenswrapper[4931]: I0130 05:24:59.980411 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9z9pd" event={"ID":"6ee75b9c-df74-490e-94ff-21eacce0b65a","Type":"ContainerStarted","Data":"9419c3c276f323b6a29f149a011311b1d434b4549c0a4bf9eff6ed75a0a8a514"} Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.159890 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0" Jan 30 05:25:00 crc kubenswrapper[4931]: E0130 05:25:00.160121 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:25:00 crc kubenswrapper[4931]: E0130 05:25:00.160149 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:25:00 crc kubenswrapper[4931]: E0130 05:25:00.160224 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift 
podName:52577244-c181-4919-b5b0-040e229163db nodeName:}" failed. No retries permitted until 2026-01-30 05:25:04.160200911 +0000 UTC m=+1039.530111168 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift") pod "swift-storage-0" (UID: "52577244-c181-4919-b5b0-040e229163db") : configmap "swift-ring-files" not found Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.163857 4931 scope.go:117] "RemoveContainer" containerID="b4197288cd21de4e055a7299affb1cb43a53991838539d7286331173ba92c743" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.188462 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-bzjkd"] Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.197393 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-bzjkd"] Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.325383 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9w9jf"] Jan 30 05:25:00 crc kubenswrapper[4931]: E0130 05:25:00.326114 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerName="init" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.328270 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerName="init" Jan 30 05:25:00 crc kubenswrapper[4931]: E0130 05:25:00.328369 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerName="dnsmasq-dns" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.328460 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerName="dnsmasq-dns" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.328772 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a34b87df-8978-4e2d-9875-a6b81a09fa84" containerName="dnsmasq-dns" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.329529 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.331834 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.355548 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9w9jf"] Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.465868 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9490636b-6e3e-48ea-85e7-3712196bc768-operator-scripts\") pod \"root-account-create-update-9w9jf\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.466003 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq2f9\" (UniqueName: \"kubernetes.io/projected/9490636b-6e3e-48ea-85e7-3712196bc768-kube-api-access-wq2f9\") pod \"root-account-create-update-9w9jf\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.567665 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9490636b-6e3e-48ea-85e7-3712196bc768-operator-scripts\") pod \"root-account-create-update-9w9jf\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.567789 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wq2f9\" (UniqueName: \"kubernetes.io/projected/9490636b-6e3e-48ea-85e7-3712196bc768-kube-api-access-wq2f9\") pod \"root-account-create-update-9w9jf\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.568670 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9490636b-6e3e-48ea-85e7-3712196bc768-operator-scripts\") pod \"root-account-create-update-9w9jf\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.583885 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq2f9\" (UniqueName: \"kubernetes.io/projected/9490636b-6e3e-48ea-85e7-3712196bc768-kube-api-access-wq2f9\") pod \"root-account-create-update-9w9jf\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:00 crc kubenswrapper[4931]: I0130 05:25:00.645542 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.016917 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" event={"ID":"eb20eb5e-4f22-4088-98dc-44eaf5ac5958","Type":"ContainerStarted","Data":"48a476fc993377e732a41ec538be9f206289fff472ea517e4a1e5122eac3f5e4"} Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.017880 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.035585 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" podStartSLOduration=6.035568647 podStartE2EDuration="6.035568647s" podCreationTimestamp="2026-01-30 05:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:01.031516321 +0000 UTC m=+1036.401426598" watchObservedRunningTime="2026-01-30 05:25:01.035568647 +0000 UTC m=+1036.405478914" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.434146 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34b87df-8978-4e2d-9875-a6b81a09fa84" path="/var/lib/kubelet/pods/a34b87df-8978-4e2d-9875-a6b81a09fa84/volumes" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.561510 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.717989 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1b1f6c-2147-48f7-87ea-e64672036831-operator-scripts\") pod \"bf1b1f6c-2147-48f7-87ea-e64672036831\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.718180 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k69lh\" (UniqueName: \"kubernetes.io/projected/bf1b1f6c-2147-48f7-87ea-e64672036831-kube-api-access-k69lh\") pod \"bf1b1f6c-2147-48f7-87ea-e64672036831\" (UID: \"bf1b1f6c-2147-48f7-87ea-e64672036831\") " Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.719299 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1b1f6c-2147-48f7-87ea-e64672036831-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf1b1f6c-2147-48f7-87ea-e64672036831" (UID: "bf1b1f6c-2147-48f7-87ea-e64672036831"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.724409 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1b1f6c-2147-48f7-87ea-e64672036831-kube-api-access-k69lh" (OuterVolumeSpecName: "kube-api-access-k69lh") pod "bf1b1f6c-2147-48f7-87ea-e64672036831" (UID: "bf1b1f6c-2147-48f7-87ea-e64672036831"). InnerVolumeSpecName "kube-api-access-k69lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.819914 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k69lh\" (UniqueName: \"kubernetes.io/projected/bf1b1f6c-2147-48f7-87ea-e64672036831-kube-api-access-k69lh\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:01 crc kubenswrapper[4931]: I0130 05:25:01.819952 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1b1f6c-2147-48f7-87ea-e64672036831-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.033442 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df05-account-create-update-xmzpk" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.033924 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df05-account-create-update-xmzpk" event={"ID":"bf1b1f6c-2147-48f7-87ea-e64672036831","Type":"ContainerDied","Data":"03bab058fa21df89a2e2e3b3b9b06339747851c18634d63060a8d6a53301dcfa"} Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.033963 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03bab058fa21df89a2e2e3b3b9b06339747851c18634d63060a8d6a53301dcfa" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.718693 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9z9pd" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.841207 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snvbc\" (UniqueName: \"kubernetes.io/projected/6ee75b9c-df74-490e-94ff-21eacce0b65a-kube-api-access-snvbc\") pod \"6ee75b9c-df74-490e-94ff-21eacce0b65a\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.841403 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee75b9c-df74-490e-94ff-21eacce0b65a-operator-scripts\") pod \"6ee75b9c-df74-490e-94ff-21eacce0b65a\" (UID: \"6ee75b9c-df74-490e-94ff-21eacce0b65a\") " Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.842540 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee75b9c-df74-490e-94ff-21eacce0b65a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ee75b9c-df74-490e-94ff-21eacce0b65a" (UID: "6ee75b9c-df74-490e-94ff-21eacce0b65a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.848867 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee75b9c-df74-490e-94ff-21eacce0b65a-kube-api-access-snvbc" (OuterVolumeSpecName: "kube-api-access-snvbc") pod "6ee75b9c-df74-490e-94ff-21eacce0b65a" (UID: "6ee75b9c-df74-490e-94ff-21eacce0b65a"). InnerVolumeSpecName "kube-api-access-snvbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.943688 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ee75b9c-df74-490e-94ff-21eacce0b65a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.943953 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snvbc\" (UniqueName: \"kubernetes.io/projected/6ee75b9c-df74-490e-94ff-21eacce0b65a-kube-api-access-snvbc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:02 crc kubenswrapper[4931]: I0130 05:25:02.980642 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9w9jf"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.015804 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9bbdw"] Jan 30 05:25:03 crc kubenswrapper[4931]: E0130 05:25:03.016189 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee75b9c-df74-490e-94ff-21eacce0b65a" containerName="mariadb-database-create" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.016202 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee75b9c-df74-490e-94ff-21eacce0b65a" containerName="mariadb-database-create" Jan 30 05:25:03 crc kubenswrapper[4931]: E0130 05:25:03.016218 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1b1f6c-2147-48f7-87ea-e64672036831" containerName="mariadb-account-create-update" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.016224 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1b1f6c-2147-48f7-87ea-e64672036831" containerName="mariadb-account-create-update" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.016370 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1b1f6c-2147-48f7-87ea-e64672036831" containerName="mariadb-account-create-update" Jan 30 
05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.016387 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee75b9c-df74-490e-94ff-21eacce0b65a" containerName="mariadb-database-create" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.016917 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.027581 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9bbdw"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.041986 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bcdcb" event={"ID":"9b9ebe73-0201-4486-9de9-e8828e84de53","Type":"ContainerStarted","Data":"397e61b0c2ae3421e485ee4187f967d9302d0416b0227f42b0fd6c0769496dc7"} Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.043814 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9w9jf" event={"ID":"9490636b-6e3e-48ea-85e7-3712196bc768","Type":"ContainerStarted","Data":"96c690aab96e9e6dade39cf91865ee404be8f47488bd67014a7dbe9d3a7a4709"} Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.050367 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9z9pd" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.050406 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9z9pd" event={"ID":"6ee75b9c-df74-490e-94ff-21eacce0b65a","Type":"ContainerDied","Data":"9419c3c276f323b6a29f149a011311b1d434b4549c0a4bf9eff6ed75a0a8a514"} Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.050476 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9419c3c276f323b6a29f149a011311b1d434b4549c0a4bf9eff6ed75a0a8a514" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.066382 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-bcdcb" podStartSLOduration=2.103062727 podStartE2EDuration="7.0663642s" podCreationTimestamp="2026-01-30 05:24:56 +0000 UTC" firstStartedPulling="2026-01-30 05:24:57.766203506 +0000 UTC m=+1033.136113763" lastFinishedPulling="2026-01-30 05:25:02.729504949 +0000 UTC m=+1038.099415236" observedRunningTime="2026-01-30 05:25:03.058975158 +0000 UTC m=+1038.428885415" watchObservedRunningTime="2026-01-30 05:25:03.0663642 +0000 UTC m=+1038.436274457" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.110246 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-595b-account-create-update-hcchn"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.111553 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.113356 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.116565 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-595b-account-create-update-hcchn"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.146703 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d338366-1ff1-4c95-aa94-30ba5c813138-operator-scripts\") pod \"keystone-db-create-9bbdw\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.146819 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgmsd\" (UniqueName: \"kubernetes.io/projected/0d338366-1ff1-4c95-aa94-30ba5c813138-kube-api-access-cgmsd\") pod \"keystone-db-create-9bbdw\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.247710 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d338366-1ff1-4c95-aa94-30ba5c813138-operator-scripts\") pod \"keystone-db-create-9bbdw\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.248045 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgmsd\" (UniqueName: \"kubernetes.io/projected/0d338366-1ff1-4c95-aa94-30ba5c813138-kube-api-access-cgmsd\") pod \"keystone-db-create-9bbdw\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " 
pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.248144 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5722020-7619-4a17-8990-e025402e2c3a-operator-scripts\") pod \"keystone-595b-account-create-update-hcchn\" (UID: \"c5722020-7619-4a17-8990-e025402e2c3a\") " pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.248292 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxdl\" (UniqueName: \"kubernetes.io/projected/c5722020-7619-4a17-8990-e025402e2c3a-kube-api-access-vbxdl\") pod \"keystone-595b-account-create-update-hcchn\" (UID: \"c5722020-7619-4a17-8990-e025402e2c3a\") " pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.248394 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d338366-1ff1-4c95-aa94-30ba5c813138-operator-scripts\") pod \"keystone-db-create-9bbdw\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.262983 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgmsd\" (UniqueName: \"kubernetes.io/projected/0d338366-1ff1-4c95-aa94-30ba5c813138-kube-api-access-cgmsd\") pod \"keystone-db-create-9bbdw\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.298952 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hqm5b"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.300080 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.311963 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hqm5b"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.334045 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.349743 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxdl\" (UniqueName: \"kubernetes.io/projected/c5722020-7619-4a17-8990-e025402e2c3a-kube-api-access-vbxdl\") pod \"keystone-595b-account-create-update-hcchn\" (UID: \"c5722020-7619-4a17-8990-e025402e2c3a\") " pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.350079 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5722020-7619-4a17-8990-e025402e2c3a-operator-scripts\") pod \"keystone-595b-account-create-update-hcchn\" (UID: \"c5722020-7619-4a17-8990-e025402e2c3a\") " pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.350725 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5722020-7619-4a17-8990-e025402e2c3a-operator-scripts\") pod \"keystone-595b-account-create-update-hcchn\" (UID: \"c5722020-7619-4a17-8990-e025402e2c3a\") " pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.365039 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxdl\" (UniqueName: \"kubernetes.io/projected/c5722020-7619-4a17-8990-e025402e2c3a-kube-api-access-vbxdl\") pod \"keystone-595b-account-create-update-hcchn\" (UID: 
\"c5722020-7619-4a17-8990-e025402e2c3a\") " pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.436006 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a921-account-create-update-mqpxv"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.437493 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.439168 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.439860 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.457041 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph7tv\" (UniqueName: \"kubernetes.io/projected/cae14e96-e869-491f-bbab-32bccf87cc10-kube-api-access-ph7tv\") pod \"placement-db-create-hqm5b\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.457158 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae14e96-e869-491f-bbab-32bccf87cc10-operator-scripts\") pod \"placement-db-create-hqm5b\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.458528 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a921-account-create-update-mqpxv"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.559093 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ph7tv\" (UniqueName: \"kubernetes.io/projected/cae14e96-e869-491f-bbab-32bccf87cc10-kube-api-access-ph7tv\") pod \"placement-db-create-hqm5b\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.559173 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-operator-scripts\") pod \"placement-a921-account-create-update-mqpxv\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.559222 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2m2x\" (UniqueName: \"kubernetes.io/projected/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-kube-api-access-l2m2x\") pod \"placement-a921-account-create-update-mqpxv\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.559254 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae14e96-e869-491f-bbab-32bccf87cc10-operator-scripts\") pod \"placement-db-create-hqm5b\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.561359 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae14e96-e869-491f-bbab-32bccf87cc10-operator-scripts\") pod \"placement-db-create-hqm5b\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.580187 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph7tv\" (UniqueName: \"kubernetes.io/projected/cae14e96-e869-491f-bbab-32bccf87cc10-kube-api-access-ph7tv\") pod \"placement-db-create-hqm5b\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.614072 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.660392 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-operator-scripts\") pod \"placement-a921-account-create-update-mqpxv\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.660472 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2m2x\" (UniqueName: \"kubernetes.io/projected/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-kube-api-access-l2m2x\") pod \"placement-a921-account-create-update-mqpxv\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.661395 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-operator-scripts\") pod \"placement-a921-account-create-update-mqpxv\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.683668 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2m2x\" (UniqueName: 
\"kubernetes.io/projected/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-kube-api-access-l2m2x\") pod \"placement-a921-account-create-update-mqpxv\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.760660 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.811275 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9bbdw"] Jan 30 05:25:03 crc kubenswrapper[4931]: I0130 05:25:03.888893 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-595b-account-create-update-hcchn"] Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.011602 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wxb94"] Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.012720 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.015081 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.015551 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lnq99" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.019662 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wxb94"] Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.062418 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hqm5b"] Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.069199 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9bbdw" event={"ID":"0d338366-1ff1-4c95-aa94-30ba5c813138","Type":"ContainerStarted","Data":"572f09b29f02dcc488a3f5a5c3037d927c6d17a2fec69ebb6da1e983b4bf5d1f"} Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.069255 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9bbdw" event={"ID":"0d338366-1ff1-4c95-aa94-30ba5c813138","Type":"ContainerStarted","Data":"79c07dc3658fb0f780ed178d88836e00752deac8a60e3cf4f66c4d5151cb9b1c"} Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.070397 4931 generic.go:334] "Generic (PLEG): container finished" podID="9490636b-6e3e-48ea-85e7-3712196bc768" containerID="0aa30d8d9eae66f63b97cadd6e1c8c0a9f5fe5356f82b3165d21d6b90e8f054f" exitCode=0 Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.070465 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9w9jf" event={"ID":"9490636b-6e3e-48ea-85e7-3712196bc768","Type":"ContainerDied","Data":"0aa30d8d9eae66f63b97cadd6e1c8c0a9f5fe5356f82b3165d21d6b90e8f054f"} Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.072671 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-595b-account-create-update-hcchn" event={"ID":"c5722020-7619-4a17-8990-e025402e2c3a","Type":"ContainerStarted","Data":"99d8e77ad688a72b40a0abbb974ba43df90e44bf29240bd1bb5c2d0a67083646"} Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.173695 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-config-data\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.173852 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpl8h\" (UniqueName: \"kubernetes.io/projected/08c65b18-0526-4eec-a608-20478c5eb008-kube-api-access-xpl8h\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.174007 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.174152 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-combined-ca-bundle\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.174233 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-db-sync-config-data\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: E0130 05:25:04.174696 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:25:04 crc kubenswrapper[4931]: E0130 05:25:04.174722 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:25:04 crc kubenswrapper[4931]: E0130 05:25:04.174766 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift podName:52577244-c181-4919-b5b0-040e229163db nodeName:}" failed. No retries permitted until 2026-01-30 05:25:12.174749168 +0000 UTC m=+1047.544659425 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift") pod "swift-storage-0" (UID: "52577244-c181-4919-b5b0-040e229163db") : configmap "swift-ring-files" not found Jan 30 05:25:04 crc kubenswrapper[4931]: W0130 05:25:04.239581 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda1ef5f2_7d57_4f89_9b48_9c603b322e5e.slice/crio-c4ee527560bb58f58bc3c41da98915d6f9a864bb846ed3e7aaa30a35f39dbc90 WatchSource:0}: Error finding container c4ee527560bb58f58bc3c41da98915d6f9a864bb846ed3e7aaa30a35f39dbc90: Status 404 returned error can't find the container with id c4ee527560bb58f58bc3c41da98915d6f9a864bb846ed3e7aaa30a35f39dbc90 Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.240857 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a921-account-create-update-mqpxv"] Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.275670 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-combined-ca-bundle\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.275726 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-db-sync-config-data\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.275783 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-config-data\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.275822 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpl8h\" (UniqueName: \"kubernetes.io/projected/08c65b18-0526-4eec-a608-20478c5eb008-kube-api-access-xpl8h\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.282839 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-combined-ca-bundle\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.287168 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-db-sync-config-data\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.289399 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-config-data\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.302164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpl8h\" (UniqueName: \"kubernetes.io/projected/08c65b18-0526-4eec-a608-20478c5eb008-kube-api-access-xpl8h\") pod \"glance-db-sync-wxb94\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") " pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.326031 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wxb94" Jan 30 05:25:04 crc kubenswrapper[4931]: I0130 05:25:04.921794 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wxb94"] Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.090074 4931 generic.go:334] "Generic (PLEG): container finished" podID="0d338366-1ff1-4c95-aa94-30ba5c813138" containerID="572f09b29f02dcc488a3f5a5c3037d927c6d17a2fec69ebb6da1e983b4bf5d1f" exitCode=0 Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.090158 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9bbdw" event={"ID":"0d338366-1ff1-4c95-aa94-30ba5c813138","Type":"ContainerDied","Data":"572f09b29f02dcc488a3f5a5c3037d927c6d17a2fec69ebb6da1e983b4bf5d1f"} Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.092179 4931 generic.go:334] "Generic (PLEG): container finished" podID="cae14e96-e869-491f-bbab-32bccf87cc10" containerID="1140a7961d708d05c85bc33a569a12461dd710e3403faa5dc7621241292e7e99" exitCode=0 Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.092233 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hqm5b" event={"ID":"cae14e96-e869-491f-bbab-32bccf87cc10","Type":"ContainerDied","Data":"1140a7961d708d05c85bc33a569a12461dd710e3403faa5dc7621241292e7e99"} Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.092292 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hqm5b" event={"ID":"cae14e96-e869-491f-bbab-32bccf87cc10","Type":"ContainerStarted","Data":"2dca52403afafa858d8c38ec0a9e5cde23ae060bb8c4aa75b1a7b8fb8ca506d0"} Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.093798 4931 generic.go:334] "Generic (PLEG): container finished" podID="c5722020-7619-4a17-8990-e025402e2c3a" containerID="424ff0eefa4783d3488bc19f3934cfec69b31ed4d156eca267b961eb0d363be6" exitCode=0 Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.093839 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-595b-account-create-update-hcchn" event={"ID":"c5722020-7619-4a17-8990-e025402e2c3a","Type":"ContainerDied","Data":"424ff0eefa4783d3488bc19f3934cfec69b31ed4d156eca267b961eb0d363be6"} Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.096053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wxb94" event={"ID":"08c65b18-0526-4eec-a608-20478c5eb008","Type":"ContainerStarted","Data":"1dde39fd71deaa1577ea5797017a140ffe24ad73bc61b3566927cd1bee60c4f1"} Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.097455 4931 generic.go:334] "Generic (PLEG): container finished" podID="da1ef5f2-7d57-4f89-9b48-9c603b322e5e" containerID="72e98c8676f758af58c2fffef7c54cd9bedf5ae4210e865b9220280e84a05578" exitCode=0 Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.097518 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a921-account-create-update-mqpxv" event={"ID":"da1ef5f2-7d57-4f89-9b48-9c603b322e5e","Type":"ContainerDied","Data":"72e98c8676f758af58c2fffef7c54cd9bedf5ae4210e865b9220280e84a05578"} Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.097587 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a921-account-create-update-mqpxv" event={"ID":"da1ef5f2-7d57-4f89-9b48-9c603b322e5e","Type":"ContainerStarted","Data":"c4ee527560bb58f58bc3c41da98915d6f9a864bb846ed3e7aaa30a35f39dbc90"} Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.500517 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.592675 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.619875 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9490636b-6e3e-48ea-85e7-3712196bc768-operator-scripts\") pod \"9490636b-6e3e-48ea-85e7-3712196bc768\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.619948 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq2f9\" (UniqueName: \"kubernetes.io/projected/9490636b-6e3e-48ea-85e7-3712196bc768-kube-api-access-wq2f9\") pod \"9490636b-6e3e-48ea-85e7-3712196bc768\" (UID: \"9490636b-6e3e-48ea-85e7-3712196bc768\") " Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.620736 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9490636b-6e3e-48ea-85e7-3712196bc768-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9490636b-6e3e-48ea-85e7-3712196bc768" (UID: "9490636b-6e3e-48ea-85e7-3712196bc768"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.665844 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9490636b-6e3e-48ea-85e7-3712196bc768-kube-api-access-wq2f9" (OuterVolumeSpecName: "kube-api-access-wq2f9") pod "9490636b-6e3e-48ea-85e7-3712196bc768" (UID: "9490636b-6e3e-48ea-85e7-3712196bc768"). InnerVolumeSpecName "kube-api-access-wq2f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.693750 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-v6tmx"] Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.694158 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" podUID="c052a747-4d6e-459f-80c2-b690015e411d" containerName="dnsmasq-dns" containerID="cri-o://3eb355680179efbcbb2cf73e83f9b34f38755a348dc73a0a8db4b58a9c1de2f1" gracePeriod=10 Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.723103 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9490636b-6e3e-48ea-85e7-3712196bc768-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:05 crc kubenswrapper[4931]: I0130 05:25:05.723133 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq2f9\" (UniqueName: \"kubernetes.io/projected/9490636b-6e3e-48ea-85e7-3712196bc768-kube-api-access-wq2f9\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.108697 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9w9jf" event={"ID":"9490636b-6e3e-48ea-85e7-3712196bc768","Type":"ContainerDied","Data":"96c690aab96e9e6dade39cf91865ee404be8f47488bd67014a7dbe9d3a7a4709"} Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.108743 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96c690aab96e9e6dade39cf91865ee404be8f47488bd67014a7dbe9d3a7a4709" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.108809 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9w9jf" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.116550 4931 generic.go:334] "Generic (PLEG): container finished" podID="c052a747-4d6e-459f-80c2-b690015e411d" containerID="3eb355680179efbcbb2cf73e83f9b34f38755a348dc73a0a8db4b58a9c1de2f1" exitCode=0 Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.116776 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" event={"ID":"c052a747-4d6e-459f-80c2-b690015e411d","Type":"ContainerDied","Data":"3eb355680179efbcbb2cf73e83f9b34f38755a348dc73a0a8db4b58a9c1de2f1"} Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.116803 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" event={"ID":"c052a747-4d6e-459f-80c2-b690015e411d","Type":"ContainerDied","Data":"d8ae5c6c06a93c29197bfde41e6a215859930a15dc388d2269865aa48021ba9a"} Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.116815 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ae5c6c06a93c29197bfde41e6a215859930a15dc388d2269865aa48021ba9a" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.158337 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.344361 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-sb\") pod \"c052a747-4d6e-459f-80c2-b690015e411d\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.344460 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-config\") pod \"c052a747-4d6e-459f-80c2-b690015e411d\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.344527 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-nb\") pod \"c052a747-4d6e-459f-80c2-b690015e411d\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.344599 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpfk6\" (UniqueName: \"kubernetes.io/projected/c052a747-4d6e-459f-80c2-b690015e411d-kube-api-access-dpfk6\") pod \"c052a747-4d6e-459f-80c2-b690015e411d\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.344640 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-dns-svc\") pod \"c052a747-4d6e-459f-80c2-b690015e411d\" (UID: \"c052a747-4d6e-459f-80c2-b690015e411d\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.363465 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c052a747-4d6e-459f-80c2-b690015e411d-kube-api-access-dpfk6" (OuterVolumeSpecName: "kube-api-access-dpfk6") pod "c052a747-4d6e-459f-80c2-b690015e411d" (UID: "c052a747-4d6e-459f-80c2-b690015e411d"). InnerVolumeSpecName "kube-api-access-dpfk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.393641 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-config" (OuterVolumeSpecName: "config") pod "c052a747-4d6e-459f-80c2-b690015e411d" (UID: "c052a747-4d6e-459f-80c2-b690015e411d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.396272 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c052a747-4d6e-459f-80c2-b690015e411d" (UID: "c052a747-4d6e-459f-80c2-b690015e411d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.402233 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c052a747-4d6e-459f-80c2-b690015e411d" (UID: "c052a747-4d6e-459f-80c2-b690015e411d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.420987 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c052a747-4d6e-459f-80c2-b690015e411d" (UID: "c052a747-4d6e-459f-80c2-b690015e411d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.446620 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.446642 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.446650 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.446661 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpfk6\" (UniqueName: \"kubernetes.io/projected/c052a747-4d6e-459f-80c2-b690015e411d-kube-api-access-dpfk6\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.446671 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c052a747-4d6e-459f-80c2-b690015e411d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.506848 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.633209 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.651719 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgmsd\" (UniqueName: \"kubernetes.io/projected/0d338366-1ff1-4c95-aa94-30ba5c813138-kube-api-access-cgmsd\") pod \"0d338366-1ff1-4c95-aa94-30ba5c813138\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.652224 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d338366-1ff1-4c95-aa94-30ba5c813138-operator-scripts\") pod \"0d338366-1ff1-4c95-aa94-30ba5c813138\" (UID: \"0d338366-1ff1-4c95-aa94-30ba5c813138\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.652997 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d338366-1ff1-4c95-aa94-30ba5c813138-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d338366-1ff1-4c95-aa94-30ba5c813138" (UID: "0d338366-1ff1-4c95-aa94-30ba5c813138"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.658314 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d338366-1ff1-4c95-aa94-30ba5c813138-kube-api-access-cgmsd" (OuterVolumeSpecName: "kube-api-access-cgmsd") pod "0d338366-1ff1-4c95-aa94-30ba5c813138" (UID: "0d338366-1ff1-4c95-aa94-30ba5c813138"). InnerVolumeSpecName "kube-api-access-cgmsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.662702 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.668227 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.753902 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae14e96-e869-491f-bbab-32bccf87cc10-operator-scripts\") pod \"cae14e96-e869-491f-bbab-32bccf87cc10\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.753988 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph7tv\" (UniqueName: \"kubernetes.io/projected/cae14e96-e869-491f-bbab-32bccf87cc10-kube-api-access-ph7tv\") pod \"cae14e96-e869-491f-bbab-32bccf87cc10\" (UID: \"cae14e96-e869-491f-bbab-32bccf87cc10\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.754287 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae14e96-e869-491f-bbab-32bccf87cc10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cae14e96-e869-491f-bbab-32bccf87cc10" (UID: "cae14e96-e869-491f-bbab-32bccf87cc10"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.754582 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae14e96-e869-491f-bbab-32bccf87cc10-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.754600 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgmsd\" (UniqueName: \"kubernetes.io/projected/0d338366-1ff1-4c95-aa94-30ba5c813138-kube-api-access-cgmsd\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.754611 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d338366-1ff1-4c95-aa94-30ba5c813138-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.757642 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae14e96-e869-491f-bbab-32bccf87cc10-kube-api-access-ph7tv" (OuterVolumeSpecName: "kube-api-access-ph7tv") pod "cae14e96-e869-491f-bbab-32bccf87cc10" (UID: "cae14e96-e869-491f-bbab-32bccf87cc10"). InnerVolumeSpecName "kube-api-access-ph7tv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.856701 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-operator-scripts\") pod \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.856783 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbxdl\" (UniqueName: \"kubernetes.io/projected/c5722020-7619-4a17-8990-e025402e2c3a-kube-api-access-vbxdl\") pod \"c5722020-7619-4a17-8990-e025402e2c3a\" (UID: \"c5722020-7619-4a17-8990-e025402e2c3a\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.857003 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2m2x\" (UniqueName: \"kubernetes.io/projected/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-kube-api-access-l2m2x\") pod \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\" (UID: \"da1ef5f2-7d57-4f89-9b48-9c603b322e5e\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.857068 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5722020-7619-4a17-8990-e025402e2c3a-operator-scripts\") pod \"c5722020-7619-4a17-8990-e025402e2c3a\" (UID: \"c5722020-7619-4a17-8990-e025402e2c3a\") " Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.857902 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5722020-7619-4a17-8990-e025402e2c3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5722020-7619-4a17-8990-e025402e2c3a" (UID: "c5722020-7619-4a17-8990-e025402e2c3a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.858571 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5722020-7619-4a17-8990-e025402e2c3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.858602 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph7tv\" (UniqueName: \"kubernetes.io/projected/cae14e96-e869-491f-bbab-32bccf87cc10-kube-api-access-ph7tv\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.858849 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da1ef5f2-7d57-4f89-9b48-9c603b322e5e" (UID: "da1ef5f2-7d57-4f89-9b48-9c603b322e5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.862539 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5722020-7619-4a17-8990-e025402e2c3a-kube-api-access-vbxdl" (OuterVolumeSpecName: "kube-api-access-vbxdl") pod "c5722020-7619-4a17-8990-e025402e2c3a" (UID: "c5722020-7619-4a17-8990-e025402e2c3a"). InnerVolumeSpecName "kube-api-access-vbxdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.865974 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-kube-api-access-l2m2x" (OuterVolumeSpecName: "kube-api-access-l2m2x") pod "da1ef5f2-7d57-4f89-9b48-9c603b322e5e" (UID: "da1ef5f2-7d57-4f89-9b48-9c603b322e5e"). InnerVolumeSpecName "kube-api-access-l2m2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.882764 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9w9jf"] Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.889720 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9w9jf"] Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.959821 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbxdl\" (UniqueName: \"kubernetes.io/projected/c5722020-7619-4a17-8990-e025402e2c3a-kube-api-access-vbxdl\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.959859 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2m2x\" (UniqueName: \"kubernetes.io/projected/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-kube-api-access-l2m2x\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:06 crc kubenswrapper[4931]: I0130 05:25:06.959871 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1ef5f2-7d57-4f89-9b48-9c603b322e5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.126227 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hqm5b" event={"ID":"cae14e96-e869-491f-bbab-32bccf87cc10","Type":"ContainerDied","Data":"2dca52403afafa858d8c38ec0a9e5cde23ae060bb8c4aa75b1a7b8fb8ca506d0"} Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.126262 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dca52403afafa858d8c38ec0a9e5cde23ae060bb8c4aa75b1a7b8fb8ca506d0" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.126333 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hqm5b" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.127923 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-595b-account-create-update-hcchn" event={"ID":"c5722020-7619-4a17-8990-e025402e2c3a","Type":"ContainerDied","Data":"99d8e77ad688a72b40a0abbb974ba43df90e44bf29240bd1bb5c2d0a67083646"} Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.127959 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d8e77ad688a72b40a0abbb974ba43df90e44bf29240bd1bb5c2d0a67083646" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.128017 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595b-account-create-update-hcchn" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.139512 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a921-account-create-update-mqpxv" event={"ID":"da1ef5f2-7d57-4f89-9b48-9c603b322e5e","Type":"ContainerDied","Data":"c4ee527560bb58f58bc3c41da98915d6f9a864bb846ed3e7aaa30a35f39dbc90"} Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.139550 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4ee527560bb58f58bc3c41da98915d6f9a864bb846ed3e7aaa30a35f39dbc90" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.139526 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a921-account-create-update-mqpxv" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.141884 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9bbdw" event={"ID":"0d338366-1ff1-4c95-aa94-30ba5c813138","Type":"ContainerDied","Data":"79c07dc3658fb0f780ed178d88836e00752deac8a60e3cf4f66c4d5151cb9b1c"} Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.141946 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79c07dc3658fb0f780ed178d88836e00752deac8a60e3cf4f66c4d5151cb9b1c" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.141904 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-v6tmx" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.141998 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9bbdw" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.195938 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-v6tmx"] Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.203893 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-v6tmx"] Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.430076 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9490636b-6e3e-48ea-85e7-3712196bc768" path="/var/lib/kubelet/pods/9490636b-6e3e-48ea-85e7-3712196bc768/volumes" Jan 30 05:25:07 crc kubenswrapper[4931]: I0130 05:25:07.430580 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c052a747-4d6e-459f-80c2-b690015e411d" path="/var/lib/kubelet/pods/c052a747-4d6e-459f-80c2-b690015e411d/volumes" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.167317 4931 generic.go:334] "Generic (PLEG): container finished" podID="9b9ebe73-0201-4486-9de9-e8828e84de53" 
containerID="397e61b0c2ae3421e485ee4187f967d9302d0416b0227f42b0fd6c0769496dc7" exitCode=0 Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.167466 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bcdcb" event={"ID":"9b9ebe73-0201-4486-9de9-e8828e84de53","Type":"ContainerDied","Data":"397e61b0c2ae3421e485ee4187f967d9302d0416b0227f42b0fd6c0769496dc7"} Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.340372 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qzj5h"] Jan 30 05:25:10 crc kubenswrapper[4931]: E0130 05:25:10.340871 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae14e96-e869-491f-bbab-32bccf87cc10" containerName="mariadb-database-create" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.340902 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae14e96-e869-491f-bbab-32bccf87cc10" containerName="mariadb-database-create" Jan 30 05:25:10 crc kubenswrapper[4931]: E0130 05:25:10.340932 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d338366-1ff1-4c95-aa94-30ba5c813138" containerName="mariadb-database-create" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.340944 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d338366-1ff1-4c95-aa94-30ba5c813138" containerName="mariadb-database-create" Jan 30 05:25:10 crc kubenswrapper[4931]: E0130 05:25:10.340969 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5722020-7619-4a17-8990-e025402e2c3a" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.340982 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5722020-7619-4a17-8990-e025402e2c3a" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: E0130 05:25:10.341003 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c052a747-4d6e-459f-80c2-b690015e411d" 
containerName="init" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341015 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c052a747-4d6e-459f-80c2-b690015e411d" containerName="init" Jan 30 05:25:10 crc kubenswrapper[4931]: E0130 05:25:10.341034 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9490636b-6e3e-48ea-85e7-3712196bc768" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341047 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9490636b-6e3e-48ea-85e7-3712196bc768" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: E0130 05:25:10.341061 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1ef5f2-7d57-4f89-9b48-9c603b322e5e" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341073 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1ef5f2-7d57-4f89-9b48-9c603b322e5e" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: E0130 05:25:10.341108 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c052a747-4d6e-459f-80c2-b690015e411d" containerName="dnsmasq-dns" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341120 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c052a747-4d6e-459f-80c2-b690015e411d" containerName="dnsmasq-dns" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341402 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9490636b-6e3e-48ea-85e7-3712196bc768" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341465 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d338366-1ff1-4c95-aa94-30ba5c813138" containerName="mariadb-database-create" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341490 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cae14e96-e869-491f-bbab-32bccf87cc10" containerName="mariadb-database-create" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341507 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5722020-7619-4a17-8990-e025402e2c3a" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341526 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1ef5f2-7d57-4f89-9b48-9c603b322e5e" containerName="mariadb-account-create-update" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.341551 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c052a747-4d6e-459f-80c2-b690015e411d" containerName="dnsmasq-dns" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.342321 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.344493 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.348976 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qzj5h"] Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.458395 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.538160 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw2q7\" (UniqueName: \"kubernetes.io/projected/602175a2-25e6-472d-b423-5ab4e6d97769-kube-api-access-qw2q7\") pod \"root-account-create-update-qzj5h\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") " pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.538829 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/602175a2-25e6-472d-b423-5ab4e6d97769-operator-scripts\") pod \"root-account-create-update-qzj5h\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") " pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.640342 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/602175a2-25e6-472d-b423-5ab4e6d97769-operator-scripts\") pod \"root-account-create-update-qzj5h\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") " pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.640573 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw2q7\" (UniqueName: \"kubernetes.io/projected/602175a2-25e6-472d-b423-5ab4e6d97769-kube-api-access-qw2q7\") pod \"root-account-create-update-qzj5h\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") " pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.641402 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/602175a2-25e6-472d-b423-5ab4e6d97769-operator-scripts\") pod \"root-account-create-update-qzj5h\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") " pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.667102 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw2q7\" (UniqueName: \"kubernetes.io/projected/602175a2-25e6-472d-b423-5ab4e6d97769-kube-api-access-qw2q7\") pod \"root-account-create-update-qzj5h\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") " pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:10 crc kubenswrapper[4931]: I0130 05:25:10.677473 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qzj5h" Jan 30 05:25:12 crc kubenswrapper[4931]: I0130 05:25:12.275817 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0" Jan 30 05:25:12 crc kubenswrapper[4931]: I0130 05:25:12.282272 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"swift-storage-0\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " pod="openstack/swift-storage-0" Jan 30 05:25:12 crc kubenswrapper[4931]: I0130 05:25:12.358743 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.242317 4931 generic.go:334] "Generic (PLEG): container finished" podID="081e3873-ea99-4486-925f-784a98e49405" containerID="4db4fc560f1e0be65146b56bdc4340b3ff1c5a4fe7510a353795090f99291213" exitCode=0 Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.242390 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"081e3873-ea99-4486-925f-784a98e49405","Type":"ContainerDied","Data":"4db4fc560f1e0be65146b56bdc4340b3ff1c5a4fe7510a353795090f99291213"} Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.861559 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.974495 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-combined-ca-bundle\") pod \"9b9ebe73-0201-4486-9de9-e8828e84de53\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.974787 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-swiftconf\") pod \"9b9ebe73-0201-4486-9de9-e8828e84de53\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.974853 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-ring-data-devices\") pod \"9b9ebe73-0201-4486-9de9-e8828e84de53\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.974872 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b9ebe73-0201-4486-9de9-e8828e84de53-etc-swift\") pod \"9b9ebe73-0201-4486-9de9-e8828e84de53\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.974942 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-scripts\") pod \"9b9ebe73-0201-4486-9de9-e8828e84de53\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.974964 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-dispersionconf\") pod \"9b9ebe73-0201-4486-9de9-e8828e84de53\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.975019 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88xkk\" (UniqueName: \"kubernetes.io/projected/9b9ebe73-0201-4486-9de9-e8828e84de53-kube-api-access-88xkk\") pod \"9b9ebe73-0201-4486-9de9-e8828e84de53\" (UID: \"9b9ebe73-0201-4486-9de9-e8828e84de53\") " Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.977398 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9b9ebe73-0201-4486-9de9-e8828e84de53" (UID: "9b9ebe73-0201-4486-9de9-e8828e84de53"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.977586 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b9ebe73-0201-4486-9de9-e8828e84de53-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9b9ebe73-0201-4486-9de9-e8828e84de53" (UID: "9b9ebe73-0201-4486-9de9-e8828e84de53"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.977622 4931 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.981244 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9ebe73-0201-4486-9de9-e8828e84de53-kube-api-access-88xkk" (OuterVolumeSpecName: "kube-api-access-88xkk") pod "9b9ebe73-0201-4486-9de9-e8828e84de53" (UID: "9b9ebe73-0201-4486-9de9-e8828e84de53"). InnerVolumeSpecName "kube-api-access-88xkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:17 crc kubenswrapper[4931]: I0130 05:25:17.989086 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9b9ebe73-0201-4486-9de9-e8828e84de53" (UID: "9b9ebe73-0201-4486-9de9-e8828e84de53"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.001414 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-scripts" (OuterVolumeSpecName: "scripts") pod "9b9ebe73-0201-4486-9de9-e8828e84de53" (UID: "9b9ebe73-0201-4486-9de9-e8828e84de53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.005411 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9b9ebe73-0201-4486-9de9-e8828e84de53" (UID: "9b9ebe73-0201-4486-9de9-e8828e84de53"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.026709 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b9ebe73-0201-4486-9de9-e8828e84de53" (UID: "9b9ebe73-0201-4486-9de9-e8828e84de53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.079836 4931 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b9ebe73-0201-4486-9de9-e8828e84de53-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.079863 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b9ebe73-0201-4486-9de9-e8828e84de53-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.079874 4931 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.079882 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88xkk\" (UniqueName: \"kubernetes.io/projected/9b9ebe73-0201-4486-9de9-e8828e84de53-kube-api-access-88xkk\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.079891 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.079899 4931 reconciler_common.go:293] "Volume detached for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b9ebe73-0201-4486-9de9-e8828e84de53-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.253605 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bcdcb" event={"ID":"9b9ebe73-0201-4486-9de9-e8828e84de53","Type":"ContainerDied","Data":"e0aeb9238b4a2a03f01dbc9b575b6093a31f0d939b0c8c92582fa2fe6528a320"} Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.253648 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0aeb9238b4a2a03f01dbc9b575b6093a31f0d939b0c8c92582fa2fe6528a320" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.253701 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bcdcb" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.262847 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"081e3873-ea99-4486-925f-784a98e49405","Type":"ContainerStarted","Data":"1bb7e19530d33f5a4cf134ca5c6644743c868cc750cac0c1bb313f0f47240dd8"} Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.263124 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.303247 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.850274399 podStartE2EDuration="1m0.303227176s" podCreationTimestamp="2026-01-30 05:24:18 +0000 UTC" firstStartedPulling="2026-01-30 05:24:33.016760119 +0000 UTC m=+1008.386670376" lastFinishedPulling="2026-01-30 05:24:41.469712896 +0000 UTC m=+1016.839623153" observedRunningTime="2026-01-30 05:25:18.298319155 +0000 UTC m=+1053.668229432" watchObservedRunningTime="2026-01-30 05:25:18.303227176 +0000 UTC m=+1053.673137433" Jan 30 05:25:18 crc kubenswrapper[4931]: 
I0130 05:25:18.347081 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qzj5h"] Jan 30 05:25:18 crc kubenswrapper[4931]: W0130 05:25:18.353645 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod602175a2_25e6_472d_b423_5ab4e6d97769.slice/crio-b27357bffaa312a9f30da66999e22bd02e43c53839aafa7ab73c9113f285a395 WatchSource:0}: Error finding container b27357bffaa312a9f30da66999e22bd02e43c53839aafa7ab73c9113f285a395: Status 404 returned error can't find the container with id b27357bffaa312a9f30da66999e22bd02e43c53839aafa7ab73c9113f285a395 Jan 30 05:25:18 crc kubenswrapper[4931]: I0130 05:25:18.414896 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.282672 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wxb94" event={"ID":"08c65b18-0526-4eec-a608-20478c5eb008","Type":"ContainerStarted","Data":"342d9a5530d63ffbf73ce7c8e40d07fb36a0b7d11a82e475df103d0dfda95398"} Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.291348 4931 generic.go:334] "Generic (PLEG): container finished" podID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerID="8bdcfbd624616a917de046867a4b176539e978b80dd1b9fad737dfdab9cb1bce" exitCode=0 Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.291402 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc3f4796-66b1-452b-afca-5e62cbf2a53b","Type":"ContainerDied","Data":"8bdcfbd624616a917de046867a4b176539e978b80dd1b9fad737dfdab9cb1bce"} Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.321091 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wxb94" podStartSLOduration=3.3785584379999998 podStartE2EDuration="16.321078138s" podCreationTimestamp="2026-01-30 05:25:03 +0000 UTC" 
firstStartedPulling="2026-01-30 05:25:04.92912182 +0000 UTC m=+1040.299032087" lastFinishedPulling="2026-01-30 05:25:17.87164153 +0000 UTC m=+1053.241551787" observedRunningTime="2026-01-30 05:25:19.320636665 +0000 UTC m=+1054.690546922" watchObservedRunningTime="2026-01-30 05:25:19.321078138 +0000 UTC m=+1054.690988395" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.322472 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"8f709bd92c7c6c28297de5f91b3d8f5726929abc3fede49c29940651ade456cb"} Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.361908 4931 generic.go:334] "Generic (PLEG): container finished" podID="602175a2-25e6-472d-b423-5ab4e6d97769" containerID="a64e91cbe33af673e6689e436885784e9c445a56b737d4748cfcdbf6fce27a53" exitCode=0 Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.363353 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qzj5h" event={"ID":"602175a2-25e6-472d-b423-5ab4e6d97769","Type":"ContainerDied","Data":"a64e91cbe33af673e6689e436885784e9c445a56b737d4748cfcdbf6fce27a53"} Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.363397 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qzj5h" event={"ID":"602175a2-25e6-472d-b423-5ab4e6d97769","Type":"ContainerStarted","Data":"b27357bffaa312a9f30da66999e22bd02e43c53839aafa7ab73c9113f285a395"} Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.401686 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-thxc2" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.405802 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-thxc2" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.436692 4931 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-ggjtl" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" probeResult="failure" output=< Jan 30 05:25:19 crc kubenswrapper[4931]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 05:25:19 crc kubenswrapper[4931]: > Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.621231 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ggjtl-config-57fgk"] Jan 30 05:25:19 crc kubenswrapper[4931]: E0130 05:25:19.621608 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9ebe73-0201-4486-9de9-e8828e84de53" containerName="swift-ring-rebalance" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.621626 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9ebe73-0201-4486-9de9-e8828e84de53" containerName="swift-ring-rebalance" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.621815 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9ebe73-0201-4486-9de9-e8828e84de53" containerName="swift-ring-rebalance" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.622416 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.626397 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.635341 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ggjtl-config-57fgk"] Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.746816 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-scripts\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.746893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-additional-scripts\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.746921 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.747016 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-log-ovn\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: 
\"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.747043 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2cqz\" (UniqueName: \"kubernetes.io/projected/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-kube-api-access-s2cqz\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.747067 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run-ovn\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.849707 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-log-ovn\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.849737 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2cqz\" (UniqueName: \"kubernetes.io/projected/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-kube-api-access-s2cqz\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.849764 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run-ovn\") pod 
\"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.849805 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-scripts\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.849850 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-additional-scripts\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.849872 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.850001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-log-ovn\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.850060 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: 
\"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.850070 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run-ovn\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.850806 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-additional-scripts\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.852008 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-scripts\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.867646 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2cqz\" (UniqueName: \"kubernetes.io/projected/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-kube-api-access-s2cqz\") pod \"ovn-controller-ggjtl-config-57fgk\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") " pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:19 crc kubenswrapper[4931]: I0130 05:25:19.954651 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ggjtl-config-57fgk" Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.383123 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc3f4796-66b1-452b-afca-5e62cbf2a53b","Type":"ContainerStarted","Data":"1c82fc5914a60be58942659c6c59b346ba961ba1c401d7f0c82d22447fc0b135"} Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.383614 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.385558 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"64945c3ef451f83b413801249e5cc8cedcb622a5a129ee80defec98e393eed29"} Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.385611 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"de7939eb8c76f478b34a03e7dd08f4a308dc8bbb63a287db0a5b3eec8794cc7c"} Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.385624 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"e78b14aff0684ab7de691a18e86ce169b9e67b8f02342d8eba7927de4cb39ec6"} Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.405627 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ggjtl-config-57fgk"] Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.438377 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.708201877 podStartE2EDuration="1m2.438357871s" podCreationTimestamp="2026-01-30 05:24:18 +0000 UTC" firstStartedPulling="2026-01-30 05:24:33.017768068 +0000 UTC m=+1008.387678325" 
lastFinishedPulling="2026-01-30 05:24:42.747924052 +0000 UTC m=+1018.117834319" observedRunningTime="2026-01-30 05:25:20.422041531 +0000 UTC m=+1055.791951788" watchObservedRunningTime="2026-01-30 05:25:20.438357871 +0000 UTC m=+1055.808268118"
Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.671349 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qzj5h"
Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.763144 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw2q7\" (UniqueName: \"kubernetes.io/projected/602175a2-25e6-472d-b423-5ab4e6d97769-kube-api-access-qw2q7\") pod \"602175a2-25e6-472d-b423-5ab4e6d97769\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") "
Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.763210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/602175a2-25e6-472d-b423-5ab4e6d97769-operator-scripts\") pod \"602175a2-25e6-472d-b423-5ab4e6d97769\" (UID: \"602175a2-25e6-472d-b423-5ab4e6d97769\") "
Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.763886 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/602175a2-25e6-472d-b423-5ab4e6d97769-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "602175a2-25e6-472d-b423-5ab4e6d97769" (UID: "602175a2-25e6-472d-b423-5ab4e6d97769"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.770124 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602175a2-25e6-472d-b423-5ab4e6d97769-kube-api-access-qw2q7" (OuterVolumeSpecName: "kube-api-access-qw2q7") pod "602175a2-25e6-472d-b423-5ab4e6d97769" (UID: "602175a2-25e6-472d-b423-5ab4e6d97769"). InnerVolumeSpecName "kube-api-access-qw2q7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.865513 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw2q7\" (UniqueName: \"kubernetes.io/projected/602175a2-25e6-472d-b423-5ab4e6d97769-kube-api-access-qw2q7\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:20 crc kubenswrapper[4931]: I0130 05:25:20.865737 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/602175a2-25e6-472d-b423-5ab4e6d97769-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.398240 4931 generic.go:334] "Generic (PLEG): container finished" podID="063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" containerID="31798e9f13d46b8721aae715c1edfd7a01d30cecc4d59728bf20993fd26d459b" exitCode=0
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.398348 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ggjtl-config-57fgk" event={"ID":"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b","Type":"ContainerDied","Data":"31798e9f13d46b8721aae715c1edfd7a01d30cecc4d59728bf20993fd26d459b"}
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.398443 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ggjtl-config-57fgk" event={"ID":"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b","Type":"ContainerStarted","Data":"5f16f48af4771a2f9b6d640da0d9b09a85cbb3309209ff0e5843e65b694d4ea0"}
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.399789 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qzj5h" event={"ID":"602175a2-25e6-472d-b423-5ab4e6d97769","Type":"ContainerDied","Data":"b27357bffaa312a9f30da66999e22bd02e43c53839aafa7ab73c9113f285a395"}
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.399824 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27357bffaa312a9f30da66999e22bd02e43c53839aafa7ab73c9113f285a395"
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.399886 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qzj5h"
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.407065 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"9eac75907fb7af02eb9159bedaf64c4ca7dee04ca441b549c3a48132b186515f"}
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.915225 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qzj5h"]
Jan 30 05:25:21 crc kubenswrapper[4931]: I0130 05:25:21.927053 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qzj5h"]
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.418184 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"6088dfc85b1e09a936dd16faccae994e80e2ccb29840c2c09302842b83328fc2"}
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.418447 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"072ac216076cb0fec2ec21789975f4b6fdf297b846d9774f980cb280a52a2718"}
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.418461 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"840bb675eae49d372214aa49017516eb7cc03feb5f0cebb6fb56a2dd4d0837b9"}
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.418469 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"01987d0b4f025a347544f55c09ea6cf4f3249f746d37222f35ee196eb4525b63"}
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.802305 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ggjtl-config-57fgk"
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.895855 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run-ovn\") pod \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") "
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.895936 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-scripts\") pod \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") "
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.895962 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-additional-scripts\") pod \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") "
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896008 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" (UID: "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896138 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2cqz\" (UniqueName: \"kubernetes.io/projected/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-kube-api-access-s2cqz\") pod \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") "
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896209 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run\") pod \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") "
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896232 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-log-ovn\") pod \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\" (UID: \"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b\") "
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896306 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run" (OuterVolumeSpecName: "var-run") pod "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" (UID: "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896434 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" (UID: "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896639 4931 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896660 4931 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-log-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896670 4931 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-var-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896785 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" (UID: "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.896939 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-scripts" (OuterVolumeSpecName: "scripts") pod "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" (UID: "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.910696 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-kube-api-access-s2cqz" (OuterVolumeSpecName: "kube-api-access-s2cqz") pod "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" (UID: "063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b"). InnerVolumeSpecName "kube-api-access-s2cqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.998395 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2cqz\" (UniqueName: \"kubernetes.io/projected/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-kube-api-access-s2cqz\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.998451 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:22 crc kubenswrapper[4931]: I0130 05:25:22.998461 4931 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b-additional-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:23 crc kubenswrapper[4931]: I0130 05:25:23.434210 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ggjtl-config-57fgk"
Jan 30 05:25:23 crc kubenswrapper[4931]: I0130 05:25:23.434953 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="602175a2-25e6-472d-b423-5ab4e6d97769" path="/var/lib/kubelet/pods/602175a2-25e6-472d-b423-5ab4e6d97769/volumes"
Jan 30 05:25:23 crc kubenswrapper[4931]: I0130 05:25:23.437864 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ggjtl-config-57fgk" event={"ID":"063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b","Type":"ContainerDied","Data":"5f16f48af4771a2f9b6d640da0d9b09a85cbb3309209ff0e5843e65b694d4ea0"}
Jan 30 05:25:23 crc kubenswrapper[4931]: I0130 05:25:23.437899 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f16f48af4771a2f9b6d640da0d9b09a85cbb3309209ff0e5843e65b694d4ea0"
Jan 30 05:25:23 crc kubenswrapper[4931]: I0130 05:25:23.920118 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ggjtl-config-57fgk"]
Jan 30 05:25:23 crc kubenswrapper[4931]: I0130 05:25:23.931060 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ggjtl-config-57fgk"]
Jan 30 05:25:24 crc kubenswrapper[4931]: I0130 05:25:24.401969 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ggjtl"
Jan 30 05:25:24 crc kubenswrapper[4931]: I0130 05:25:24.451081 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"fa1a93081b269f4ada317cd8046ebb7f1a7c1edf1f6e97c13ed393eacd7e1973"}
Jan 30 05:25:24 crc kubenswrapper[4931]: I0130 05:25:24.452058 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"cf71a5f4e5a8611b3edb8a350ba0e2eedbd78c0fb76770c94841152df4a3ab69"}
Jan 30 05:25:24 crc kubenswrapper[4931]: I0130 05:25:24.452147 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"577bb47efe5f44d38e3c888fdf879028a229599eb11554a344d6a077afa58802"}
Jan 30 05:25:24 crc kubenswrapper[4931]: I0130 05:25:24.452209 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"2aa8176e0269c78ed82e92b582f8a0a44311ad87daa1079e948a15315c72207f"}
Jan 30 05:25:24 crc kubenswrapper[4931]: I0130 05:25:24.452267 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"cc9cceab2cf461936102038fbf1707f4be2e195decb5808193c7f51c4adb08d3"}
Jan 30 05:25:24 crc kubenswrapper[4931]: I0130 05:25:24.452332 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"b4e57fcd32132c1d41dc41783803f5dfdbd53a0317437ca189732c5c62a33471"}
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.367429 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-r9xdc"]
Jan 30 05:25:25 crc kubenswrapper[4931]: E0130 05:25:25.367963 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602175a2-25e6-472d-b423-5ab4e6d97769" containerName="mariadb-account-create-update"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.367980 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="602175a2-25e6-472d-b423-5ab4e6d97769" containerName="mariadb-account-create-update"
Jan 30 05:25:25 crc kubenswrapper[4931]: E0130 05:25:25.368013 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" containerName="ovn-config"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.368020 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" containerName="ovn-config"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.368153 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" containerName="ovn-config"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.368175 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="602175a2-25e6-472d-b423-5ab4e6d97769" containerName="mariadb-account-create-update"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.368644 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.371027 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.381037 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r9xdc"]
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.444589 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj824\" (UniqueName: \"kubernetes.io/projected/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-kube-api-access-cj824\") pod \"root-account-create-update-r9xdc\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.444766 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-operator-scripts\") pod \"root-account-create-update-r9xdc\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.464211 4931 generic.go:334] "Generic (PLEG): container finished" podID="08c65b18-0526-4eec-a608-20478c5eb008" containerID="342d9a5530d63ffbf73ce7c8e40d07fb36a0b7d11a82e475df103d0dfda95398" exitCode=0
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.465296 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b" path="/var/lib/kubelet/pods/063f4bc7-8b66-4d8d-b8e2-cf8c713f6a2b/volumes"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.465875 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wxb94" event={"ID":"08c65b18-0526-4eec-a608-20478c5eb008","Type":"ContainerDied","Data":"342d9a5530d63ffbf73ce7c8e40d07fb36a0b7d11a82e475df103d0dfda95398"}
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.483926 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerStarted","Data":"7960131bc61ab6450751b905e24e2ccae8d9fe2d400984f5011874b3859c6719"}
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.552520 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-operator-scripts\") pod \"root-account-create-update-r9xdc\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.552692 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj824\" (UniqueName: \"kubernetes.io/projected/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-kube-api-access-cj824\") pod \"root-account-create-update-r9xdc\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.553638 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-operator-scripts\") pod \"root-account-create-update-r9xdc\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.572764 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj824\" (UniqueName: \"kubernetes.io/projected/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-kube-api-access-cj824\") pod \"root-account-create-update-r9xdc\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.596006 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=25.717109971 podStartE2EDuration="30.595984539s" podCreationTimestamp="2026-01-30 05:24:55 +0000 UTC" firstStartedPulling="2026-01-30 05:25:18.425397071 +0000 UTC m=+1053.795307328" lastFinishedPulling="2026-01-30 05:25:23.304271639 +0000 UTC m=+1058.674181896" observedRunningTime="2026-01-30 05:25:25.590735148 +0000 UTC m=+1060.960645405" watchObservedRunningTime="2026-01-30 05:25:25.595984539 +0000 UTC m=+1060.965894796"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.749507 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r9xdc"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.937071 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-z2kmr"]
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.938307 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.942895 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.960625 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.960711 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.960776 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.960828 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-config\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.960858 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktd6m\" (UniqueName: \"kubernetes.io/projected/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-kube-api-access-ktd6m\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.960896 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:25 crc kubenswrapper[4931]: I0130 05:25:25.974263 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-z2kmr"]
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.061925 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.061990 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.062028 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-config\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.062046 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktd6m\" (UniqueName: \"kubernetes.io/projected/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-kube-api-access-ktd6m\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.062072 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.062119 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.062860 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.063398 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-config\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.063499 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.064597 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.065463 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.089684 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktd6m\" (UniqueName: \"kubernetes.io/projected/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-kube-api-access-ktd6m\") pod \"dnsmasq-dns-8467b54bcc-z2kmr\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.198106 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r9xdc"]
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.264815 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.492404 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9xdc" event={"ID":"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4","Type":"ContainerStarted","Data":"cf97da0cb0eda4f19afadca8bff99c148f7d4875216c60c7fab1145ddf6c9ea3"}
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.492674 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9xdc" event={"ID":"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4","Type":"ContainerStarted","Data":"b215b0a315c6bcf28b690ce191fa2523a30b8bde34ca9d45018d198fcb7ee9fe"}
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.515028 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-r9xdc" podStartSLOduration=1.515007708 podStartE2EDuration="1.515007708s" podCreationTimestamp="2026-01-30 05:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:26.512142415 +0000 UTC m=+1061.882052682" watchObservedRunningTime="2026-01-30 05:25:26.515007708 +0000 UTC m=+1061.884917985"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.727183 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-z2kmr"]
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.950237 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wxb94"
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.974882 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-db-sync-config-data\") pod \"08c65b18-0526-4eec-a608-20478c5eb008\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") "
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.974953 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpl8h\" (UniqueName: \"kubernetes.io/projected/08c65b18-0526-4eec-a608-20478c5eb008-kube-api-access-xpl8h\") pod \"08c65b18-0526-4eec-a608-20478c5eb008\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") "
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.975035 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-combined-ca-bundle\") pod \"08c65b18-0526-4eec-a608-20478c5eb008\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") "
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.975068 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-config-data\") pod \"08c65b18-0526-4eec-a608-20478c5eb008\" (UID: \"08c65b18-0526-4eec-a608-20478c5eb008\") "
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.978714 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c65b18-0526-4eec-a608-20478c5eb008-kube-api-access-xpl8h" (OuterVolumeSpecName: "kube-api-access-xpl8h") pod "08c65b18-0526-4eec-a608-20478c5eb008" (UID: "08c65b18-0526-4eec-a608-20478c5eb008"). InnerVolumeSpecName "kube-api-access-xpl8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:26 crc kubenswrapper[4931]: I0130 05:25:26.982568 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "08c65b18-0526-4eec-a608-20478c5eb008" (UID: "08c65b18-0526-4eec-a608-20478c5eb008"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.009885 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c65b18-0526-4eec-a608-20478c5eb008" (UID: "08c65b18-0526-4eec-a608-20478c5eb008"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.023336 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-config-data" (OuterVolumeSpecName: "config-data") pod "08c65b18-0526-4eec-a608-20478c5eb008" (UID: "08c65b18-0526-4eec-a608-20478c5eb008"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.076964 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.076998 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.077007 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08c65b18-0526-4eec-a608-20478c5eb008-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.077019 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpl8h\" (UniqueName: \"kubernetes.io/projected/08c65b18-0526-4eec-a608-20478c5eb008-kube-api-access-xpl8h\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.505814 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerID="88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346" exitCode=0
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.505857 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" event={"ID":"0c2b2206-fcd5-432f-82a7-20e22cd3ceef","Type":"ContainerDied","Data":"88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346"}
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.505902 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" event={"ID":"0c2b2206-fcd5-432f-82a7-20e22cd3ceef","Type":"ContainerStarted","Data":"9d954a97d7ab108beb1f87cd63eb9168552d1563db4086227881a73279ff0b7b"}
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.508621 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wxb94" event={"ID":"08c65b18-0526-4eec-a608-20478c5eb008","Type":"ContainerDied","Data":"1dde39fd71deaa1577ea5797017a140ffe24ad73bc61b3566927cd1bee60c4f1"}
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.508666 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dde39fd71deaa1577ea5797017a140ffe24ad73bc61b3566927cd1bee60c4f1"
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.508680 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wxb94"
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.510925 4931 generic.go:334] "Generic (PLEG): container finished" podID="6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4" containerID="cf97da0cb0eda4f19afadca8bff99c148f7d4875216c60c7fab1145ddf6c9ea3" exitCode=0
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.510954 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9xdc" event={"ID":"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4","Type":"ContainerDied","Data":"cf97da0cb0eda4f19afadca8bff99c148f7d4875216c60c7fab1145ddf6c9ea3"}
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.961671 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-z2kmr"]
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.979478 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"]
Jan 30 05:25:27 crc kubenswrapper[4931]: E0130 05:25:27.979787 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c65b18-0526-4eec-a608-20478c5eb008" containerName="glance-db-sync"
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.979804 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c65b18-0526-4eec-a608-20478c5eb008" containerName="glance-db-sync"
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.979943 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c65b18-0526-4eec-a608-20478c5eb008" containerName="glance-db-sync"
Jan 30 05:25:27 crc kubenswrapper[4931]: I0130 05:25:27.980712 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"
Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.008951 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"]
Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.096651 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"
Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.096710 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p52jc\" (UniqueName: \"kubernetes.io/projected/be176172-3d0c-47ae-aa98-d7ee20022f44-kube-api-access-p52jc\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"
Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.096730 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"
Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.096768 4931
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.096824 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.096858 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-config\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.197876 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-config\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.198141 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.198175 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p52jc\" (UniqueName: \"kubernetes.io/projected/be176172-3d0c-47ae-aa98-d7ee20022f44-kube-api-access-p52jc\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.198191 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.198226 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.198277 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.199091 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.199591 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-config\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.200079 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.200888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.201515 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.217822 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p52jc\" (UniqueName: \"kubernetes.io/projected/be176172-3d0c-47ae-aa98-d7ee20022f44-kube-api-access-p52jc\") pod \"dnsmasq-dns-56c9bc6f5c-8c6pt\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.298368 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.520954 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" event={"ID":"0c2b2206-fcd5-432f-82a7-20e22cd3ceef","Type":"ContainerStarted","Data":"a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7"} Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.521194 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.543611 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" podStartSLOduration=3.543592508 podStartE2EDuration="3.543592508s" podCreationTimestamp="2026-01-30 05:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:28.537004689 +0000 UTC m=+1063.906914966" watchObservedRunningTime="2026-01-30 05:25:28.543592508 +0000 UTC m=+1063.913502755" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.571922 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"] Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.891614 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-r9xdc" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.914281 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj824\" (UniqueName: \"kubernetes.io/projected/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-kube-api-access-cj824\") pod \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.914405 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-operator-scripts\") pod \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\" (UID: \"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4\") " Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.914949 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4" (UID: "6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:28 crc kubenswrapper[4931]: I0130 05:25:28.928086 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-kube-api-access-cj824" (OuterVolumeSpecName: "kube-api-access-cj824") pod "6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4" (UID: "6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4"). InnerVolumeSpecName "kube-api-access-cj824". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.016457 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj824\" (UniqueName: \"kubernetes.io/projected/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-kube-api-access-cj824\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.016716 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.530118 4931 generic.go:334] "Generic (PLEG): container finished" podID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerID="aa6a0d8cd249f8b0104844bcd59d7c80f0ef6c784ec9f9d65e07215bbb280738" exitCode=0 Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.530166 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" event={"ID":"be176172-3d0c-47ae-aa98-d7ee20022f44","Type":"ContainerDied","Data":"aa6a0d8cd249f8b0104844bcd59d7c80f0ef6c784ec9f9d65e07215bbb280738"} Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.530235 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" event={"ID":"be176172-3d0c-47ae-aa98-d7ee20022f44","Type":"ContainerStarted","Data":"af809bcfb9bcd948f444820cb7e724048ff5c243bf6772c74d31c5eab0630ea9"} Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.531490 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-r9xdc" Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.532014 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9xdc" event={"ID":"6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4","Type":"ContainerDied","Data":"b215b0a315c6bcf28b690ce191fa2523a30b8bde34ca9d45018d198fcb7ee9fe"} Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.532035 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b215b0a315c6bcf28b690ce191fa2523a30b8bde34ca9d45018d198fcb7ee9fe" Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.532086 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" podUID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerName="dnsmasq-dns" containerID="cri-o://a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7" gracePeriod=10 Jan 30 05:25:29 crc kubenswrapper[4931]: I0130 05:25:29.992107 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.039951 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-nb\") pod \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.040007 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-svc\") pod \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.040059 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktd6m\" (UniqueName: \"kubernetes.io/projected/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-kube-api-access-ktd6m\") pod \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.040128 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-sb\") pod \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.040173 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-config\") pod \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.040217 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-swift-storage-0\") pod \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\" (UID: \"0c2b2206-fcd5-432f-82a7-20e22cd3ceef\") " Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.044753 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-kube-api-access-ktd6m" (OuterVolumeSpecName: "kube-api-access-ktd6m") pod "0c2b2206-fcd5-432f-82a7-20e22cd3ceef" (UID: "0c2b2206-fcd5-432f-82a7-20e22cd3ceef"). InnerVolumeSpecName "kube-api-access-ktd6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.083146 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0c2b2206-fcd5-432f-82a7-20e22cd3ceef" (UID: "0c2b2206-fcd5-432f-82a7-20e22cd3ceef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.089495 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0c2b2206-fcd5-432f-82a7-20e22cd3ceef" (UID: "0c2b2206-fcd5-432f-82a7-20e22cd3ceef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.090479 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c2b2206-fcd5-432f-82a7-20e22cd3ceef" (UID: "0c2b2206-fcd5-432f-82a7-20e22cd3ceef"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.096175 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-config" (OuterVolumeSpecName: "config") pod "0c2b2206-fcd5-432f-82a7-20e22cd3ceef" (UID: "0c2b2206-fcd5-432f-82a7-20e22cd3ceef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.098471 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0c2b2206-fcd5-432f-82a7-20e22cd3ceef" (UID: "0c2b2206-fcd5-432f-82a7-20e22cd3ceef"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.143382 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.143718 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.143734 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.143748 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktd6m\" (UniqueName: \"kubernetes.io/projected/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-kube-api-access-ktd6m\") on node \"crc\" 
DevicePath \"\"" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.143762 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.143774 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2b2206-fcd5-432f-82a7-20e22cd3ceef-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.329056 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.436467 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.544392 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" event={"ID":"be176172-3d0c-47ae-aa98-d7ee20022f44","Type":"ContainerStarted","Data":"566ade23da173169f793e0bfc68dfb7fc94d967bbb01c1ecaa6d6c7476150a48"} Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.548035 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.552634 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerID="a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7" exitCode=0 Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.552665 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" event={"ID":"0c2b2206-fcd5-432f-82a7-20e22cd3ceef","Type":"ContainerDied","Data":"a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7"} Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 
05:25:30.552684 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" event={"ID":"0c2b2206-fcd5-432f-82a7-20e22cd3ceef","Type":"ContainerDied","Data":"9d954a97d7ab108beb1f87cd63eb9168552d1563db4086227881a73279ff0b7b"} Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.552702 4931 scope.go:117] "RemoveContainer" containerID="a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.552809 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-z2kmr" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.584903 4931 scope.go:117] "RemoveContainer" containerID="88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.590732 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" podStartSLOduration=3.590719371 podStartE2EDuration="3.590719371s" podCreationTimestamp="2026-01-30 05:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:30.586482359 +0000 UTC m=+1065.956392616" watchObservedRunningTime="2026-01-30 05:25:30.590719371 +0000 UTC m=+1065.960629628" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.606049 4931 scope.go:117] "RemoveContainer" containerID="a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7" Jan 30 05:25:30 crc kubenswrapper[4931]: E0130 05:25:30.610500 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7\": container with ID starting with a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7 not found: ID does not exist" 
containerID="a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.610541 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7"} err="failed to get container status \"a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7\": rpc error: code = NotFound desc = could not find container \"a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7\": container with ID starting with a37b67a5432034fca03956c6e5e234c7b9a5f80ea311a013b6f1df6284c784f7 not found: ID does not exist" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.610576 4931 scope.go:117] "RemoveContainer" containerID="88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346" Jan 30 05:25:30 crc kubenswrapper[4931]: E0130 05:25:30.613671 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346\": container with ID starting with 88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346 not found: ID does not exist" containerID="88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.613705 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346"} err="failed to get container status \"88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346\": rpc error: code = NotFound desc = could not find container \"88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346\": container with ID starting with 88439359b01e7aed3c2e8e25a07c1a042c6d964234627219eaa5cf8b88027346 not found: ID does not exist" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.620211 4931 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-z2kmr"] Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.626317 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-z2kmr"] Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.684937 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cb5c-account-create-update-7n4vq"] Jan 30 05:25:30 crc kubenswrapper[4931]: E0130 05:25:30.685235 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4" containerName="mariadb-account-create-update" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.685251 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4" containerName="mariadb-account-create-update" Jan 30 05:25:30 crc kubenswrapper[4931]: E0130 05:25:30.685265 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerName="init" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.685271 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerName="init" Jan 30 05:25:30 crc kubenswrapper[4931]: E0130 05:25:30.685285 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerName="dnsmasq-dns" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.685291 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerName="dnsmasq-dns" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.685449 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" containerName="dnsmasq-dns" Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.685474 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4" 
containerName="mariadb-account-create-update"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.685941 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.689331 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.712337 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cb5c-account-create-update-7n4vq"]
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.721083 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wtjbg"]
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.721983 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.735630 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wtjbg"]
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.757279 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsmf\" (UniqueName: \"kubernetes.io/projected/d98e6af1-4571-4da7-a6e8-0b54505af47c-kube-api-access-xwsmf\") pod \"barbican-cb5c-account-create-update-7n4vq\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") " pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.757339 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-operator-scripts\") pod \"cinder-db-create-wtjbg\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") " pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.757398 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfn76\" (UniqueName: \"kubernetes.io/projected/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-kube-api-access-nfn76\") pod \"cinder-db-create-wtjbg\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") " pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.757519 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d98e6af1-4571-4da7-a6e8-0b54505af47c-operator-scripts\") pod \"barbican-cb5c-account-create-update-7n4vq\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") " pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.808758 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4c2nt"]
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.809663 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.826318 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4c2nt"]
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.858320 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsmf\" (UniqueName: \"kubernetes.io/projected/d98e6af1-4571-4da7-a6e8-0b54505af47c-kube-api-access-xwsmf\") pod \"barbican-cb5c-account-create-update-7n4vq\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") " pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.858481 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3b14699-8089-4af7-b0bd-654a8fda9715-operator-scripts\") pod \"barbican-db-create-4c2nt\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.858563 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-operator-scripts\") pod \"cinder-db-create-wtjbg\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") " pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.858769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfn76\" (UniqueName: \"kubernetes.io/projected/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-kube-api-access-nfn76\") pod \"cinder-db-create-wtjbg\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") " pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.858919 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d98e6af1-4571-4da7-a6e8-0b54505af47c-operator-scripts\") pod \"barbican-cb5c-account-create-update-7n4vq\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") " pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.858975 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24htz\" (UniqueName: \"kubernetes.io/projected/c3b14699-8089-4af7-b0bd-654a8fda9715-kube-api-access-24htz\") pod \"barbican-db-create-4c2nt\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.859668 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-operator-scripts\") pod \"cinder-db-create-wtjbg\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") " pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.859868 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d98e6af1-4571-4da7-a6e8-0b54505af47c-operator-scripts\") pod \"barbican-cb5c-account-create-update-7n4vq\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") " pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.885835 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfn76\" (UniqueName: \"kubernetes.io/projected/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-kube-api-access-nfn76\") pod \"cinder-db-create-wtjbg\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") " pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.906534 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsmf\" (UniqueName: \"kubernetes.io/projected/d98e6af1-4571-4da7-a6e8-0b54505af47c-kube-api-access-xwsmf\") pod \"barbican-cb5c-account-create-update-7n4vq\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") " pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.944444 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8ee9-account-create-update-sdn4j"]
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.945724 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.952865 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.960742 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3b14699-8089-4af7-b0bd-654a8fda9715-operator-scripts\") pod \"barbican-db-create-4c2nt\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.961551 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3b14699-8089-4af7-b0bd-654a8fda9715-operator-scripts\") pod \"barbican-db-create-4c2nt\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.961606 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b78d3f2-c575-4b24-bbb8-c956f61a575d-operator-scripts\") pod \"cinder-8ee9-account-create-update-sdn4j\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.961632 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blgs2\" (UniqueName: \"kubernetes.io/projected/3b78d3f2-c575-4b24-bbb8-c956f61a575d-kube-api-access-blgs2\") pod \"cinder-8ee9-account-create-update-sdn4j\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.961606 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8ee9-account-create-update-sdn4j"]
Jan 30 05:25:30 crc kubenswrapper[4931]: I0130 05:25:30.961729 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24htz\" (UniqueName: \"kubernetes.io/projected/c3b14699-8089-4af7-b0bd-654a8fda9715-kube-api-access-24htz\") pod \"barbican-db-create-4c2nt\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.001402 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.007057 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24htz\" (UniqueName: \"kubernetes.io/projected/c3b14699-8089-4af7-b0bd-654a8fda9715-kube-api-access-24htz\") pod \"barbican-db-create-4c2nt\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.046749 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.050952 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4gqzx"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.051875 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: W0130 05:25:31.059046 4931 reflector.go:561] object-"openstack"/"keystone-keystone-dockercfg-llv5h": failed to list *v1.Secret: secrets "keystone-keystone-dockercfg-llv5h" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Jan 30 05:25:31 crc kubenswrapper[4931]: E0130 05:25:31.059084 4931 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-keystone-dockercfg-llv5h\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-keystone-dockercfg-llv5h\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 05:25:31 crc kubenswrapper[4931]: W0130 05:25:31.059121 4931 reflector.go:561] object-"openstack"/"keystone-config-data": failed to list *v1.Secret: secrets "keystone-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Jan 30 05:25:31 crc kubenswrapper[4931]: E0130 05:25:31.059132 4931 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 05:25:31 crc kubenswrapper[4931]: W0130 05:25:31.059159 4931 reflector.go:561] object-"openstack"/"keystone": failed to list *v1.Secret: secrets "keystone" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Jan 30 05:25:31 crc kubenswrapper[4931]: E0130 05:25:31.059169 4931 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 05:25:31 crc kubenswrapper[4931]: W0130 05:25:31.062718 4931 reflector.go:561] object-"openstack"/"keystone-scripts": failed to list *v1.Secret: secrets "keystone-scripts" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Jan 30 05:25:31 crc kubenswrapper[4931]: E0130 05:25:31.062745 4931 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-scripts\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.063558 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.063627 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-combined-ca-bundle\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.063725 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b78d3f2-c575-4b24-bbb8-c956f61a575d-operator-scripts\") pod \"cinder-8ee9-account-create-update-sdn4j\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.063749 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blgs2\" (UniqueName: \"kubernetes.io/projected/3b78d3f2-c575-4b24-bbb8-c956f61a575d-kube-api-access-blgs2\") pod \"cinder-8ee9-account-create-update-sdn4j\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.063775 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lgmx\" (UniqueName: \"kubernetes.io/projected/9d923658-472c-4565-bae3-5eb1e329a92c-kube-api-access-6lgmx\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.064469 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b78d3f2-c575-4b24-bbb8-c956f61a575d-operator-scripts\") pod \"cinder-8ee9-account-create-update-sdn4j\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.108178 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4gqzx"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.110129 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blgs2\" (UniqueName: \"kubernetes.io/projected/3b78d3f2-c575-4b24-bbb8-c956f61a575d-kube-api-access-blgs2\") pod \"cinder-8ee9-account-create-update-sdn4j\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.151199 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-557f-account-create-update-6vjq5"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.152108 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: W0130 05:25:31.158206 4931 reflector.go:561] object-"openstack"/"neutron-db-secret": failed to list *v1.Secret: secrets "neutron-db-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Jan 30 05:25:31 crc kubenswrapper[4931]: E0130 05:25:31.158239 4931 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"neutron-db-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"neutron-db-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.164769 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-operator-scripts\") pod \"neutron-557f-account-create-update-6vjq5\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.164817 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lgmx\" (UniqueName: \"kubernetes.io/projected/9d923658-472c-4565-bae3-5eb1e329a92c-kube-api-access-6lgmx\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.164838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.164869 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2w74\" (UniqueName: \"kubernetes.io/projected/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-kube-api-access-q2w74\") pod \"neutron-557f-account-create-update-6vjq5\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.164898 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-combined-ca-bundle\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.166117 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.200173 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-combined-ca-bundle\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.200342 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-557f-account-create-update-6vjq5"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.212649 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lgmx\" (UniqueName: \"kubernetes.io/projected/9d923658-472c-4565-bae3-5eb1e329a92c-kube-api-access-6lgmx\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.230083 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-cwv94"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.231163 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.262652 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cwv94"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.267587 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2w74\" (UniqueName: \"kubernetes.io/projected/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-kube-api-access-q2w74\") pod \"neutron-557f-account-create-update-6vjq5\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.267839 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-operator-scripts\") pod \"neutron-557f-account-create-update-6vjq5\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.268609 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-operator-scripts\") pod \"neutron-557f-account-create-update-6vjq5\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.302696 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.344020 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2w74\" (UniqueName: \"kubernetes.io/projected/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-kube-api-access-q2w74\") pod \"neutron-557f-account-create-update-6vjq5\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.372363 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnnxt\" (UniqueName: \"kubernetes.io/projected/2c29ace9-3be7-44a1-b8eb-d356a4721152-kube-api-access-pnnxt\") pod \"neutron-db-create-cwv94\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.372537 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c29ace9-3be7-44a1-b8eb-d356a4721152-operator-scripts\") pod \"neutron-db-create-cwv94\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.475010 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnnxt\" (UniqueName: \"kubernetes.io/projected/2c29ace9-3be7-44a1-b8eb-d356a4721152-kube-api-access-pnnxt\") pod \"neutron-db-create-cwv94\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.475111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c29ace9-3be7-44a1-b8eb-d356a4721152-operator-scripts\") pod \"neutron-db-create-cwv94\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.475746 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c29ace9-3be7-44a1-b8eb-d356a4721152-operator-scripts\") pod \"neutron-db-create-cwv94\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.480156 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2b2206-fcd5-432f-82a7-20e22cd3ceef" path="/var/lib/kubelet/pods/0c2b2206-fcd5-432f-82a7-20e22cd3ceef/volumes"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.480876 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cb5c-account-create-update-7n4vq"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.491077 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-557f-account-create-update-6vjq5"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.505672 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wtjbg"]
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.514035 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnnxt\" (UniqueName: \"kubernetes.io/projected/2c29ace9-3be7-44a1-b8eb-d356a4721152-kube-api-access-pnnxt\") pod \"neutron-db-create-cwv94\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.566876 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.570291 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wtjbg" event={"ID":"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0","Type":"ContainerStarted","Data":"612d6371cb07082ef0429b90b0d863ecd1c92b6f45ac7168b52671d88d7ef25d"}
Jan 30 05:25:31 crc kubenswrapper[4931]: I0130 05:25:31.584943 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cb5c-account-create-update-7n4vq" event={"ID":"d98e6af1-4571-4da7-a6e8-0b54505af47c","Type":"ContainerStarted","Data":"aa1bc5770dea471d374a2f149f70de91cf907c08892cfe93a195b132fc8e0d87"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:31.880133 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4c2nt"]
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:31.882397 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:31.959107 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8ee9-account-create-update-sdn4j"]
Jan 30 05:25:35 crc kubenswrapper[4931]: W0130 05:25:31.965322 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b78d3f2_c575_4b24_bbb8_c956f61a575d.slice/crio-64352ea50cbd49b7e446033adc71bd1636d6e240e488199b49931a95d2cff069 WatchSource:0}: Error finding container 64352ea50cbd49b7e446033adc71bd1636d6e240e488199b49931a95d2cff069: Status 404 returned error can't find the container with id 64352ea50cbd49b7e446033adc71bd1636d6e240e488199b49931a95d2cff069
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:31.999885 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-r9xdc"]
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.000306 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.007056 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-r9xdc"]
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.032759 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-557f-account-create-update-6vjq5"]
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.138211 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llv5h"
Jan 30 05:25:35 crc kubenswrapper[4931]: E0130 05:25:32.166876 4931 secret.go:188] Couldn't get secret openstack/keystone-config-data: failed to sync secret cache: timed out waiting for the condition
Jan 30 05:25:35 crc kubenswrapper[4931]: E0130 05:25:32.167042 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data podName:9d923658-472c-4565-bae3-5eb1e329a92c nodeName:}" failed. No retries permitted until 2026-01-30 05:25:32.667013539 +0000 UTC m=+1068.036923796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data") pod "keystone-db-sync-4gqzx" (UID: "9d923658-472c-4565-bae3-5eb1e329a92c") : failed to sync secret cache: timed out waiting for the condition
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.367148 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.392408 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.591787 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-557f-account-create-update-6vjq5" event={"ID":"df6b82f5-5c39-4101-b9f8-05aaf9547a0b","Type":"ContainerStarted","Data":"aa23f5e073a3cef33987241913fa85543847130828c056bd851165f88aec0d30"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.592930 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8ee9-account-create-update-sdn4j" event={"ID":"3b78d3f2-c575-4b24-bbb8-c956f61a575d","Type":"ContainerStarted","Data":"64352ea50cbd49b7e446033adc71bd1636d6e240e488199b49931a95d2cff069"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.594152 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4c2nt" event={"ID":"c3b14699-8089-4af7-b0bd-654a8fda9715","Type":"ContainerStarted","Data":"c98a171eb3b4b2a4cc6684d3ea0e312812cf19d06e872769140d149acb612bf9"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.595511 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wtjbg" event={"ID":"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0","Type":"ContainerStarted","Data":"15692c1b35f8b38884128c50d64f7fe3e0155bf28a22d7aceb44dcecc3b74210"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.597033 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cb5c-account-create-update-7n4vq" event={"ID":"d98e6af1-4571-4da7-a6e8-0b54505af47c","Type":"ContainerStarted","Data":"98322602699d7d942a6271f0a7fc74a73af1b5a299d4d538e2ee24bc7375a406"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.612895 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-wtjbg" podStartSLOduration=2.612873856 podStartE2EDuration="2.612873856s" podCreationTimestamp="2026-01-30 05:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:32.606790591 +0000 UTC m=+1067.976700848" watchObservedRunningTime="2026-01-30 05:25:32.612873856 +0000 UTC m=+1067.982784113"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.627849 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-cb5c-account-create-update-7n4vq" podStartSLOduration=2.627831146 podStartE2EDuration="2.627831146s" podCreationTimestamp="2026-01-30 05:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:32.624955754 +0000 UTC m=+1067.994866011" watchObservedRunningTime="2026-01-30 05:25:32.627831146 +0000 UTC m=+1067.997741403"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.700319 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.706605 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data\") pod \"keystone-db-sync-4gqzx\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") " pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:32.911801 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:33.432608 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4" path="/var/lib/kubelet/pods/6cbb8ac5-f347-4f13-9c58-ecc57a62dfb4/volumes"
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:33.626906 4931 generic.go:334] "Generic (PLEG): container finished" podID="d98e6af1-4571-4da7-a6e8-0b54505af47c" containerID="98322602699d7d942a6271f0a7fc74a73af1b5a299d4d538e2ee24bc7375a406" exitCode=0
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:33.626988 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cb5c-account-create-update-7n4vq" event={"ID":"d98e6af1-4571-4da7-a6e8-0b54505af47c","Type":"ContainerDied","Data":"98322602699d7d942a6271f0a7fc74a73af1b5a299d4d538e2ee24bc7375a406"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:33.636230 4931 generic.go:334] "Generic (PLEG): container finished" podID="e65373ae-84e0-4338-be4c-8cc8bd2d3fb0" containerID="15692c1b35f8b38884128c50d64f7fe3e0155bf28a22d7aceb44dcecc3b74210" exitCode=0
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:33.636343 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wtjbg" event={"ID":"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0","Type":"ContainerDied","Data":"15692c1b35f8b38884128c50d64f7fe3e0155bf28a22d7aceb44dcecc3b74210"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:35.657209 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8ee9-account-create-update-sdn4j" event={"ID":"3b78d3f2-c575-4b24-bbb8-c956f61a575d","Type":"ContainerStarted","Data":"2f4a9744870428aceb547b2acdf130704ce1aaa6370e1105462da3c72da4e168"}
Jan 30 05:25:35 crc kubenswrapper[4931]: I0130 05:25:35.663035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4c2nt" event={"ID":"c3b14699-8089-4af7-b0bd-654a8fda9715","Type":"ContainerStarted","Data":"1a58b66910dccd0a3d3aecf3a69cc3be05007daec35ed4f4da6ecaf7deb3050f"}
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.086438 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cwv94"]
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.192738 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4gqzx"]
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.304632 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-7n4vq"
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.309181 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wtjbg"
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.459060 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwsmf\" (UniqueName: \"kubernetes.io/projected/d98e6af1-4571-4da7-a6e8-0b54505af47c-kube-api-access-xwsmf\") pod \"d98e6af1-4571-4da7-a6e8-0b54505af47c\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") "
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.459317 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfn76\" (UniqueName: \"kubernetes.io/projected/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-kube-api-access-nfn76\") pod \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") "
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.459436 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-operator-scripts\") pod \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\" (UID: \"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0\") "
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.459491 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d98e6af1-4571-4da7-a6e8-0b54505af47c-operator-scripts\") pod \"d98e6af1-4571-4da7-a6e8-0b54505af47c\" (UID: \"d98e6af1-4571-4da7-a6e8-0b54505af47c\") "
Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.460169 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e65373ae-84e0-4338-be4c-8cc8bd2d3fb0" (UID: "e65373ae-84e0-4338-be4c-8cc8bd2d3fb0"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.460439 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d98e6af1-4571-4da7-a6e8-0b54505af47c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d98e6af1-4571-4da7-a6e8-0b54505af47c" (UID: "d98e6af1-4571-4da7-a6e8-0b54505af47c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.465150 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-kube-api-access-nfn76" (OuterVolumeSpecName: "kube-api-access-nfn76") pod "e65373ae-84e0-4338-be4c-8cc8bd2d3fb0" (UID: "e65373ae-84e0-4338-be4c-8cc8bd2d3fb0"). InnerVolumeSpecName "kube-api-access-nfn76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.465301 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98e6af1-4571-4da7-a6e8-0b54505af47c-kube-api-access-xwsmf" (OuterVolumeSpecName: "kube-api-access-xwsmf") pod "d98e6af1-4571-4da7-a6e8-0b54505af47c" (UID: "d98e6af1-4571-4da7-a6e8-0b54505af47c"). InnerVolumeSpecName "kube-api-access-xwsmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.561773 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwsmf\" (UniqueName: \"kubernetes.io/projected/d98e6af1-4571-4da7-a6e8-0b54505af47c-kube-api-access-xwsmf\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.561815 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfn76\" (UniqueName: \"kubernetes.io/projected/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-kube-api-access-nfn76\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.561827 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.561840 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d98e6af1-4571-4da7-a6e8-0b54505af47c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.675379 4931 generic.go:334] "Generic (PLEG): container finished" podID="2c29ace9-3be7-44a1-b8eb-d356a4721152" containerID="f8a2b41856adf7471c684772afc9b12f445fbd24f6ab5036ce18fde6331c17d4" exitCode=0 Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.675472 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cwv94" event={"ID":"2c29ace9-3be7-44a1-b8eb-d356a4721152","Type":"ContainerDied","Data":"f8a2b41856adf7471c684772afc9b12f445fbd24f6ab5036ce18fde6331c17d4"} Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.675550 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cwv94" 
event={"ID":"2c29ace9-3be7-44a1-b8eb-d356a4721152","Type":"ContainerStarted","Data":"5db11c59dc0ea93fac43524325f66f48d5401cc5cc845a375c0bc2d6e3288c9e"} Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.678562 4931 generic.go:334] "Generic (PLEG): container finished" podID="3b78d3f2-c575-4b24-bbb8-c956f61a575d" containerID="2f4a9744870428aceb547b2acdf130704ce1aaa6370e1105462da3c72da4e168" exitCode=0 Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.678739 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8ee9-account-create-update-sdn4j" event={"ID":"3b78d3f2-c575-4b24-bbb8-c956f61a575d","Type":"ContainerDied","Data":"2f4a9744870428aceb547b2acdf130704ce1aaa6370e1105462da3c72da4e168"} Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.680577 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3b14699-8089-4af7-b0bd-654a8fda9715" containerID="1a58b66910dccd0a3d3aecf3a69cc3be05007daec35ed4f4da6ecaf7deb3050f" exitCode=0 Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.680674 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4c2nt" event={"ID":"c3b14699-8089-4af7-b0bd-654a8fda9715","Type":"ContainerDied","Data":"1a58b66910dccd0a3d3aecf3a69cc3be05007daec35ed4f4da6ecaf7deb3050f"} Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.682400 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wtjbg" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.682407 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wtjbg" event={"ID":"e65373ae-84e0-4338-be4c-8cc8bd2d3fb0","Type":"ContainerDied","Data":"612d6371cb07082ef0429b90b0d863ecd1c92b6f45ac7168b52671d88d7ef25d"} Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.682909 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="612d6371cb07082ef0429b90b0d863ecd1c92b6f45ac7168b52671d88d7ef25d" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.683559 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4gqzx" event={"ID":"9d923658-472c-4565-bae3-5eb1e329a92c","Type":"ContainerStarted","Data":"685d0bdee5d1a947f16ef0e29880c274ee3422ec09c45cb58fd08bec46c96278"} Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.685085 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-7n4vq" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.685051 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cb5c-account-create-update-7n4vq" event={"ID":"d98e6af1-4571-4da7-a6e8-0b54505af47c","Type":"ContainerDied","Data":"aa1bc5770dea471d374a2f149f70de91cf907c08892cfe93a195b132fc8e0d87"} Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.685138 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa1bc5770dea471d374a2f149f70de91cf907c08892cfe93a195b132fc8e0d87" Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.686876 4931 generic.go:334] "Generic (PLEG): container finished" podID="df6b82f5-5c39-4101-b9f8-05aaf9547a0b" containerID="94fc6d9869d9820d8c965d9ddc61b4a6003c2bcfb528dd4f82ab1c383ce5be01" exitCode=0 Jan 30 05:25:36 crc kubenswrapper[4931]: I0130 05:25:36.687016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-557f-account-create-update-6vjq5" event={"ID":"df6b82f5-5c39-4101-b9f8-05aaf9547a0b","Type":"ContainerDied","Data":"94fc6d9869d9820d8c965d9ddc61b4a6003c2bcfb528dd4f82ab1c383ce5be01"} Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.002093 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-p975f"] Jan 30 05:25:37 crc kubenswrapper[4931]: E0130 05:25:37.003326 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e6af1-4571-4da7-a6e8-0b54505af47c" containerName="mariadb-account-create-update" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.003370 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e6af1-4571-4da7-a6e8-0b54505af47c" containerName="mariadb-account-create-update" Jan 30 05:25:37 crc kubenswrapper[4931]: E0130 05:25:37.003393 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65373ae-84e0-4338-be4c-8cc8bd2d3fb0" 
containerName="mariadb-database-create" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.003402 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65373ae-84e0-4338-be4c-8cc8bd2d3fb0" containerName="mariadb-database-create" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.004170 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65373ae-84e0-4338-be4c-8cc8bd2d3fb0" containerName="mariadb-database-create" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.004207 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98e6af1-4571-4da7-a6e8-0b54505af47c" containerName="mariadb-account-create-update" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.006471 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.009535 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.011863 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p975f"] Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.175770 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd612f9b-4de8-48e4-a945-c97e5c495292-operator-scripts\") pod \"root-account-create-update-p975f\" (UID: \"dd612f9b-4de8-48e4-a945-c97e5c495292\") " pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.175952 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv7l4\" (UniqueName: \"kubernetes.io/projected/dd612f9b-4de8-48e4-a945-c97e5c495292-kube-api-access-tv7l4\") pod \"root-account-create-update-p975f\" (UID: 
\"dd612f9b-4de8-48e4-a945-c97e5c495292\") " pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.277497 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv7l4\" (UniqueName: \"kubernetes.io/projected/dd612f9b-4de8-48e4-a945-c97e5c495292-kube-api-access-tv7l4\") pod \"root-account-create-update-p975f\" (UID: \"dd612f9b-4de8-48e4-a945-c97e5c495292\") " pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.277577 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd612f9b-4de8-48e4-a945-c97e5c495292-operator-scripts\") pod \"root-account-create-update-p975f\" (UID: \"dd612f9b-4de8-48e4-a945-c97e5c495292\") " pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.278390 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd612f9b-4de8-48e4-a945-c97e5c495292-operator-scripts\") pod \"root-account-create-update-p975f\" (UID: \"dd612f9b-4de8-48e4-a945-c97e5c495292\") " pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.300681 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv7l4\" (UniqueName: \"kubernetes.io/projected/dd612f9b-4de8-48e4-a945-c97e5c495292-kube-api-access-tv7l4\") pod \"root-account-create-update-p975f\" (UID: \"dd612f9b-4de8-48e4-a945-c97e5c495292\") " pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.331924 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-p975f" Jan 30 05:25:37 crc kubenswrapper[4931]: I0130 05:25:37.798382 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p975f"] Jan 30 05:25:37 crc kubenswrapper[4931]: W0130 05:25:37.805724 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd612f9b_4de8_48e4_a945_c97e5c495292.slice/crio-c42924e228e0faa34852385f08124501fd35f78817571e20aa497d3a1537a563 WatchSource:0}: Error finding container c42924e228e0faa34852385f08124501fd35f78817571e20aa497d3a1537a563: Status 404 returned error can't find the container with id c42924e228e0faa34852385f08124501fd35f78817571e20aa497d3a1537a563 Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.006027 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-557f-account-create-update-6vjq5" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.096990 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2w74\" (UniqueName: \"kubernetes.io/projected/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-kube-api-access-q2w74\") pod \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.097171 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-operator-scripts\") pod \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\" (UID: \"df6b82f5-5c39-4101-b9f8-05aaf9547a0b\") " Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.098081 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"df6b82f5-5c39-4101-b9f8-05aaf9547a0b" (UID: "df6b82f5-5c39-4101-b9f8-05aaf9547a0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.105634 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-kube-api-access-q2w74" (OuterVolumeSpecName: "kube-api-access-q2w74") pod "df6b82f5-5c39-4101-b9f8-05aaf9547a0b" (UID: "df6b82f5-5c39-4101-b9f8-05aaf9547a0b"). InnerVolumeSpecName "kube-api-access-q2w74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.198808 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2w74\" (UniqueName: \"kubernetes.io/projected/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-kube-api-access-q2w74\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.198847 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b82f5-5c39-4101-b9f8-05aaf9547a0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.300642 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.376305 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-5hp4b"] Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.376595 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerName="dnsmasq-dns" containerID="cri-o://48a476fc993377e732a41ec538be9f206289fff472ea517e4a1e5122eac3f5e4" gracePeriod=10 Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.704659 4931 generic.go:334] "Generic 
(PLEG): container finished" podID="dd612f9b-4de8-48e4-a945-c97e5c495292" containerID="e65d7d2b5f976da6a48bf573c615d7b8b7b4da4391bf0bdecb5b42aeee5717fb" exitCode=0 Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.704719 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p975f" event={"ID":"dd612f9b-4de8-48e4-a945-c97e5c495292","Type":"ContainerDied","Data":"e65d7d2b5f976da6a48bf573c615d7b8b7b4da4391bf0bdecb5b42aeee5717fb"} Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.704744 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p975f" event={"ID":"dd612f9b-4de8-48e4-a945-c97e5c495292","Type":"ContainerStarted","Data":"c42924e228e0faa34852385f08124501fd35f78817571e20aa497d3a1537a563"} Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.711255 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-557f-account-create-update-6vjq5" event={"ID":"df6b82f5-5c39-4101-b9f8-05aaf9547a0b","Type":"ContainerDied","Data":"aa23f5e073a3cef33987241913fa85543847130828c056bd851165f88aec0d30"} Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.711281 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa23f5e073a3cef33987241913fa85543847130828c056bd851165f88aec0d30" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.711465 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-557f-account-create-update-6vjq5" Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.715935 4931 generic.go:334] "Generic (PLEG): container finished" podID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerID="48a476fc993377e732a41ec538be9f206289fff472ea517e4a1e5122eac3f5e4" exitCode=0 Jan 30 05:25:38 crc kubenswrapper[4931]: I0130 05:25:38.716039 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" event={"ID":"eb20eb5e-4f22-4088-98dc-44eaf5ac5958","Type":"ContainerDied","Data":"48a476fc993377e732a41ec538be9f206289fff472ea517e4a1e5122eac3f5e4"} Jan 30 05:25:40 crc kubenswrapper[4931]: I0130 05:25:40.591157 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.180277 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p975f" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.208013 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-sdn4j" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.219810 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cwv94" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.248150 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4c2nt" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.252272 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd612f9b-4de8-48e4-a945-c97e5c495292-operator-scripts\") pod \"dd612f9b-4de8-48e4-a945-c97e5c495292\" (UID: \"dd612f9b-4de8-48e4-a945-c97e5c495292\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.252367 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv7l4\" (UniqueName: \"kubernetes.io/projected/dd612f9b-4de8-48e4-a945-c97e5c495292-kube-api-access-tv7l4\") pod \"dd612f9b-4de8-48e4-a945-c97e5c495292\" (UID: \"dd612f9b-4de8-48e4-a945-c97e5c495292\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.253678 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd612f9b-4de8-48e4-a945-c97e5c495292-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd612f9b-4de8-48e4-a945-c97e5c495292" (UID: "dd612f9b-4de8-48e4-a945-c97e5c495292"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.262216 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd612f9b-4de8-48e4-a945-c97e5c495292-kube-api-access-tv7l4" (OuterVolumeSpecName: "kube-api-access-tv7l4") pod "dd612f9b-4de8-48e4-a945-c97e5c495292" (UID: "dd612f9b-4de8-48e4-a945-c97e5c495292"). InnerVolumeSpecName "kube-api-access-tv7l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.302123 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.353313 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3b14699-8089-4af7-b0bd-654a8fda9715-operator-scripts\") pod \"c3b14699-8089-4af7-b0bd-654a8fda9715\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.353596 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c29ace9-3be7-44a1-b8eb-d356a4721152-operator-scripts\") pod \"2c29ace9-3be7-44a1-b8eb-d356a4721152\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.353664 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blgs2\" (UniqueName: \"kubernetes.io/projected/3b78d3f2-c575-4b24-bbb8-c956f61a575d-kube-api-access-blgs2\") pod \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.353771 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3b14699-8089-4af7-b0bd-654a8fda9715-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3b14699-8089-4af7-b0bd-654a8fda9715" (UID: "c3b14699-8089-4af7-b0bd-654a8fda9715"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.353790 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24htz\" (UniqueName: \"kubernetes.io/projected/c3b14699-8089-4af7-b0bd-654a8fda9715-kube-api-access-24htz\") pod \"c3b14699-8089-4af7-b0bd-654a8fda9715\" (UID: \"c3b14699-8089-4af7-b0bd-654a8fda9715\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.353865 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b78d3f2-c575-4b24-bbb8-c956f61a575d-operator-scripts\") pod \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\" (UID: \"3b78d3f2-c575-4b24-bbb8-c956f61a575d\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.353888 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnnxt\" (UniqueName: \"kubernetes.io/projected/2c29ace9-3be7-44a1-b8eb-d356a4721152-kube-api-access-pnnxt\") pod \"2c29ace9-3be7-44a1-b8eb-d356a4721152\" (UID: \"2c29ace9-3be7-44a1-b8eb-d356a4721152\") " Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.354473 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3b14699-8089-4af7-b0bd-654a8fda9715-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.354488 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd612f9b-4de8-48e4-a945-c97e5c495292-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.354498 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv7l4\" (UniqueName: \"kubernetes.io/projected/dd612f9b-4de8-48e4-a945-c97e5c495292-kube-api-access-tv7l4\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:41 crc 
kubenswrapper[4931]: I0130 05:25:41.354538 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c29ace9-3be7-44a1-b8eb-d356a4721152-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c29ace9-3be7-44a1-b8eb-d356a4721152" (UID: "2c29ace9-3be7-44a1-b8eb-d356a4721152"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.358127 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b78d3f2-c575-4b24-bbb8-c956f61a575d-kube-api-access-blgs2" (OuterVolumeSpecName: "kube-api-access-blgs2") pod "3b78d3f2-c575-4b24-bbb8-c956f61a575d" (UID: "3b78d3f2-c575-4b24-bbb8-c956f61a575d"). InnerVolumeSpecName "kube-api-access-blgs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.358320 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b14699-8089-4af7-b0bd-654a8fda9715-kube-api-access-24htz" (OuterVolumeSpecName: "kube-api-access-24htz") pod "c3b14699-8089-4af7-b0bd-654a8fda9715" (UID: "c3b14699-8089-4af7-b0bd-654a8fda9715"). InnerVolumeSpecName "kube-api-access-24htz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.359053 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c29ace9-3be7-44a1-b8eb-d356a4721152-kube-api-access-pnnxt" (OuterVolumeSpecName: "kube-api-access-pnnxt") pod "2c29ace9-3be7-44a1-b8eb-d356a4721152" (UID: "2c29ace9-3be7-44a1-b8eb-d356a4721152"). InnerVolumeSpecName "kube-api-access-pnnxt". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.359624 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b78d3f2-c575-4b24-bbb8-c956f61a575d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b78d3f2-c575-4b24-bbb8-c956f61a575d" (UID: "3b78d3f2-c575-4b24-bbb8-c956f61a575d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.455916 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-nb\") pod \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") "
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.455964 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-config\") pod \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") "
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.455988 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pjqc\" (UniqueName: \"kubernetes.io/projected/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-kube-api-access-4pjqc\") pod \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") "
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.456140 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-sb\") pod \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") "
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.456182 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-dns-svc\") pod \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\" (UID: \"eb20eb5e-4f22-4088-98dc-44eaf5ac5958\") "
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.456495 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c29ace9-3be7-44a1-b8eb-d356a4721152-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.456510 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blgs2\" (UniqueName: \"kubernetes.io/projected/3b78d3f2-c575-4b24-bbb8-c956f61a575d-kube-api-access-blgs2\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.456522 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24htz\" (UniqueName: \"kubernetes.io/projected/c3b14699-8089-4af7-b0bd-654a8fda9715-kube-api-access-24htz\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.456533 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b78d3f2-c575-4b24-bbb8-c956f61a575d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.456541 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnnxt\" (UniqueName: \"kubernetes.io/projected/2c29ace9-3be7-44a1-b8eb-d356a4721152-kube-api-access-pnnxt\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.471210 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-kube-api-access-4pjqc" (OuterVolumeSpecName: "kube-api-access-4pjqc") pod "eb20eb5e-4f22-4088-98dc-44eaf5ac5958" (UID: "eb20eb5e-4f22-4088-98dc-44eaf5ac5958"). InnerVolumeSpecName "kube-api-access-4pjqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.497150 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb20eb5e-4f22-4088-98dc-44eaf5ac5958" (UID: "eb20eb5e-4f22-4088-98dc-44eaf5ac5958"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.506845 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb20eb5e-4f22-4088-98dc-44eaf5ac5958" (UID: "eb20eb5e-4f22-4088-98dc-44eaf5ac5958"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.508062 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb20eb5e-4f22-4088-98dc-44eaf5ac5958" (UID: "eb20eb5e-4f22-4088-98dc-44eaf5ac5958"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.510217 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-config" (OuterVolumeSpecName: "config") pod "eb20eb5e-4f22-4088-98dc-44eaf5ac5958" (UID: "eb20eb5e-4f22-4088-98dc-44eaf5ac5958"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.559047 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.559185 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.559356 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.559398 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.559414 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pjqc\" (UniqueName: \"kubernetes.io/projected/eb20eb5e-4f22-4088-98dc-44eaf5ac5958-kube-api-access-4pjqc\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.746768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8ee9-account-create-update-sdn4j" event={"ID":"3b78d3f2-c575-4b24-bbb8-c956f61a575d","Type":"ContainerDied","Data":"64352ea50cbd49b7e446033adc71bd1636d6e240e488199b49931a95d2cff069"}
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.746814 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64352ea50cbd49b7e446033adc71bd1636d6e240e488199b49931a95d2cff069"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.746883 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-sdn4j"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.754595 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4c2nt"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.754630 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4c2nt" event={"ID":"c3b14699-8089-4af7-b0bd-654a8fda9715","Type":"ContainerDied","Data":"c98a171eb3b4b2a4cc6684d3ea0e312812cf19d06e872769140d149acb612bf9"}
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.754671 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c98a171eb3b4b2a4cc6684d3ea0e312812cf19d06e872769140d149acb612bf9"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.761039 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4gqzx" event={"ID":"9d923658-472c-4565-bae3-5eb1e329a92c","Type":"ContainerStarted","Data":"508e6e3003e86ee32a2b32dcec684271942a70c7d32070551d5605127eb8d9ad"}
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.764557 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b" event={"ID":"eb20eb5e-4f22-4088-98dc-44eaf5ac5958","Type":"ContainerDied","Data":"683aec62918f40b319d4b21f6811c3625fb69dac85c6d8a9170f3b1e7160bffa"}
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.764622 4931 scope.go:117] "RemoveContainer" containerID="48a476fc993377e732a41ec538be9f206289fff472ea517e4a1e5122eac3f5e4"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.764791 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-5hp4b"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.768688 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cwv94"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.768882 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cwv94" event={"ID":"2c29ace9-3be7-44a1-b8eb-d356a4721152","Type":"ContainerDied","Data":"5db11c59dc0ea93fac43524325f66f48d5401cc5cc845a375c0bc2d6e3288c9e"}
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.768927 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db11c59dc0ea93fac43524325f66f48d5401cc5cc845a375c0bc2d6e3288c9e"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.772086 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p975f" event={"ID":"dd612f9b-4de8-48e4-a945-c97e5c495292","Type":"ContainerDied","Data":"c42924e228e0faa34852385f08124501fd35f78817571e20aa497d3a1537a563"}
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.772115 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42924e228e0faa34852385f08124501fd35f78817571e20aa497d3a1537a563"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.772174 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p975f"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.778719 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4gqzx" podStartSLOduration=5.912243733 podStartE2EDuration="10.778706695s" podCreationTimestamp="2026-01-30 05:25:31 +0000 UTC" firstStartedPulling="2026-01-30 05:25:36.199361395 +0000 UTC m=+1071.569271652" lastFinishedPulling="2026-01-30 05:25:41.065824357 +0000 UTC m=+1076.435734614" observedRunningTime="2026-01-30 05:25:41.777153991 +0000 UTC m=+1077.147064268" watchObservedRunningTime="2026-01-30 05:25:41.778706695 +0000 UTC m=+1077.148616962"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.806823 4931 scope.go:117] "RemoveContainer" containerID="5b51c3e6a6e67206beccccc2be017d2e75bb1a8386fa12f6af6b641475f06048"
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.810739 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-5hp4b"]
Jan 30 05:25:41 crc kubenswrapper[4931]: I0130 05:25:41.817795 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-5hp4b"]
Jan 30 05:25:43 crc kubenswrapper[4931]: I0130 05:25:43.439042 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" path="/var/lib/kubelet/pods/eb20eb5e-4f22-4088-98dc-44eaf5ac5958/volumes"
Jan 30 05:25:44 crc kubenswrapper[4931]: I0130 05:25:44.811467 4931 generic.go:334] "Generic (PLEG): container finished" podID="9d923658-472c-4565-bae3-5eb1e329a92c" containerID="508e6e3003e86ee32a2b32dcec684271942a70c7d32070551d5605127eb8d9ad" exitCode=0
Jan 30 05:25:44 crc kubenswrapper[4931]: I0130 05:25:44.811471 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4gqzx" event={"ID":"9d923658-472c-4565-bae3-5eb1e329a92c","Type":"ContainerDied","Data":"508e6e3003e86ee32a2b32dcec684271942a70c7d32070551d5605127eb8d9ad"}
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.244045 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.356030 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lgmx\" (UniqueName: \"kubernetes.io/projected/9d923658-472c-4565-bae3-5eb1e329a92c-kube-api-access-6lgmx\") pod \"9d923658-472c-4565-bae3-5eb1e329a92c\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") "
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.356118 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-combined-ca-bundle\") pod \"9d923658-472c-4565-bae3-5eb1e329a92c\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") "
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.356173 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data\") pod \"9d923658-472c-4565-bae3-5eb1e329a92c\" (UID: \"9d923658-472c-4565-bae3-5eb1e329a92c\") "
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.363653 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d923658-472c-4565-bae3-5eb1e329a92c-kube-api-access-6lgmx" (OuterVolumeSpecName: "kube-api-access-6lgmx") pod "9d923658-472c-4565-bae3-5eb1e329a92c" (UID: "9d923658-472c-4565-bae3-5eb1e329a92c"). InnerVolumeSpecName "kube-api-access-6lgmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.397619 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d923658-472c-4565-bae3-5eb1e329a92c" (UID: "9d923658-472c-4565-bae3-5eb1e329a92c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.399745 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data" (OuterVolumeSpecName: "config-data") pod "9d923658-472c-4565-bae3-5eb1e329a92c" (UID: "9d923658-472c-4565-bae3-5eb1e329a92c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.457753 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lgmx\" (UniqueName: \"kubernetes.io/projected/9d923658-472c-4565-bae3-5eb1e329a92c-kube-api-access-6lgmx\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.457795 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.457806 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d923658-472c-4565-bae3-5eb1e329a92c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.839047 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4gqzx" event={"ID":"9d923658-472c-4565-bae3-5eb1e329a92c","Type":"ContainerDied","Data":"685d0bdee5d1a947f16ef0e29880c274ee3422ec09c45cb58fd08bec46c96278"}
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.839105 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="685d0bdee5d1a947f16ef0e29880c274ee3422ec09c45cb58fd08bec46c96278"
Jan 30 05:25:46 crc kubenswrapper[4931]: I0130 05:25:46.839116 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4gqzx"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.078381 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zrpdg"]
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079015 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c29ace9-3be7-44a1-b8eb-d356a4721152" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079035 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c29ace9-3be7-44a1-b8eb-d356a4721152" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079054 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b14699-8089-4af7-b0bd-654a8fda9715" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079062 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b14699-8089-4af7-b0bd-654a8fda9715" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079078 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d923658-472c-4565-bae3-5eb1e329a92c" containerName="keystone-db-sync"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079086 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d923658-472c-4565-bae3-5eb1e329a92c" containerName="keystone-db-sync"
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079105 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd612f9b-4de8-48e4-a945-c97e5c495292" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079114 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd612f9b-4de8-48e4-a945-c97e5c495292" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079130 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b78d3f2-c575-4b24-bbb8-c956f61a575d" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079140 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b78d3f2-c575-4b24-bbb8-c956f61a575d" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079157 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6b82f5-5c39-4101-b9f8-05aaf9547a0b" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079165 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6b82f5-5c39-4101-b9f8-05aaf9547a0b" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079175 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerName="init"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079183 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerName="init"
Jan 30 05:25:47 crc kubenswrapper[4931]: E0130 05:25:47.079205 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerName="dnsmasq-dns"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079213 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerName="dnsmasq-dns"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079394 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6b82f5-5c39-4101-b9f8-05aaf9547a0b" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079406 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d923658-472c-4565-bae3-5eb1e329a92c" containerName="keystone-db-sync"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079442 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd612f9b-4de8-48e4-a945-c97e5c495292" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079453 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b78d3f2-c575-4b24-bbb8-c956f61a575d" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079472 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c29ace9-3be7-44a1-b8eb-d356a4721152" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079485 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b14699-8089-4af7-b0bd-654a8fda9715" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.079502 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb20eb5e-4f22-4088-98dc-44eaf5ac5958" containerName="dnsmasq-dns"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.080490 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.089046 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zrpdg"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.149079 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2gl9c"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.150408 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.156472 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.156624 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.156703 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.156815 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.156936 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llv5h"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.161521 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2gl9c"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.170861 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.170942 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.170979 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-config\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.171014 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbtg6\" (UniqueName: \"kubernetes.io/projected/81259525-e98e-4119-9071-2e17b0fb1640-kube-api-access-nbtg6\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.171031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.171051 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272459 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wwl\" (UniqueName: \"kubernetes.io/projected/8946f758-7352-4859-a3c3-b98bca9b99e4-kube-api-access-p7wwl\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272512 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-fernet-keys\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272551 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272595 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-config\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272631 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbtg6\" (UniqueName: \"kubernetes.io/projected/81259525-e98e-4119-9071-2e17b0fb1640-kube-api-access-nbtg6\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272649 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272667 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-config-data\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272690 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272720 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-credential-keys\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272742 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-combined-ca-bundle\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272757 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.272782 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-scripts\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.273625 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.274154 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-config\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.274887 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.275389 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.275900 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.311484 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbtg6\" (UniqueName: \"kubernetes.io/projected/81259525-e98e-4119-9071-2e17b0fb1640-kube-api-access-nbtg6\") pod \"dnsmasq-dns-54b4bb76d5-zrpdg\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.326143 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.327938 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.333952 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.334109 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.344008 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.374914 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wwl\" (UniqueName: \"kubernetes.io/projected/8946f758-7352-4859-a3c3-b98bca9b99e4-kube-api-access-p7wwl\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.374984 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-fernet-keys\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.375094 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-config-data\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.375141 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-credential-keys\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.375175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-combined-ca-bundle\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.375209 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-scripts\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.379302 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-scripts\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.379537 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-credential-keys\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.379563 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-config-data\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.381179 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-fernet-keys\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.387365 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-combined-ca-bundle\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.394997 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wwl\" (UniqueName: \"kubernetes.io/projected/8946f758-7352-4859-a3c3-b98bca9b99e4-kube-api-access-p7wwl\") pod \"keystone-bootstrap-2gl9c\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") " pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.404910 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ldr24"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.405874 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ldr24"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.407992 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vt49t"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.409045 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.410577 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.439179 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ldr24"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.461487 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kbkmb"]
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.462488 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kbkmb"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.465936 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.466196 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ffrzt"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.468015 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.471279 4931 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-bootstrap-2gl9c" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.476905 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-combined-ca-bundle\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.476959 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-scripts\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.476994 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-config-data\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.477031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-log-httpd\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.477059 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-db-sync-config-data\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc 
kubenswrapper[4931]: I0130 05:25:47.477658 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rpr97"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.484484 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.478325 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.484977 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p852h\" (UniqueName: \"kubernetes.io/projected/f3ddcee7-a757-43b5-bf76-552cbd8d9078-kube-api-access-p852h\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.485004 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-run-httpd\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.485095 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9x6\" (UniqueName: \"kubernetes.io/projected/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-kube-api-access-nl9x6\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.485128 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.486267 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kv6bp" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.486578 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.488593 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.506259 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kbkmb"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.518022 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rpr97"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.578782 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fkqxj"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.579770 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fkqxj" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.586977 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.587289 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.587815 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fkqxj"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.588884 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-config\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.588915 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-db-sync-config-data\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.588955 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-config-data\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.588984 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9x6\" (UniqueName: \"kubernetes.io/projected/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-kube-api-access-nl9x6\") pod 
\"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.589008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.589130 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvg45\" (UniqueName: \"kubernetes.io/projected/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-kube-api-access-zvg45\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.589150 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-etc-machine-id\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.598760 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-combined-ca-bundle\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.598855 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-scripts\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc 
kubenswrapper[4931]: I0130 05:25:47.598915 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-config-data\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.598963 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-scripts\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599035 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-log-httpd\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599052 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-combined-ca-bundle\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599075 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6jwz\" (UniqueName: \"kubernetes.io/projected/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-kube-api-access-x6jwz\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599110 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-db-sync-config-data\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599172 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-combined-ca-bundle\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599191 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599211 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p852h\" (UniqueName: \"kubernetes.io/projected/f3ddcee7-a757-43b5-bf76-552cbd8d9078-kube-api-access-p852h\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599232 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-run-httpd\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.599690 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-run-httpd\") pod \"ceilometer-0\" 
(UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.611456 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zrpdg"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.611917 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-log-httpd\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.612481 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-scripts\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.618323 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-db-sync-config-data\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.619347 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl9x6\" (UniqueName: \"kubernetes.io/projected/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-kube-api-access-nl9x6\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.620030 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-combined-ca-bundle\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " 
pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.621009 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-config-data\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.622787 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.625028 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.631970 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fttzx" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.634398 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.635282 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p852h\" (UniqueName: \"kubernetes.io/projected/f3ddcee7-a757-43b5-bf76-552cbd8d9078-kube-api-access-p852h\") pod \"barbican-db-sync-ldr24\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") " pod="openstack/barbican-db-sync-ldr24" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.635674 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.659920 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.694021 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"] Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703697 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-combined-ca-bundle\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703740 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6jwz\" (UniqueName: \"kubernetes.io/projected/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-kube-api-access-x6jwz\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703784 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-combined-ca-bundle\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703814 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-logs\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703831 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-scripts\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-db-sync-config-data\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-config\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703886 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-config-data\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703927 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-combined-ca-bundle\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703951 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvg45\" (UniqueName: 
\"kubernetes.io/projected/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-kube-api-access-zvg45\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.703970 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-etc-machine-id\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.704024 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lkqq\" (UniqueName: \"kubernetes.io/projected/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-kube-api-access-5lkqq\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.704047 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-config-data\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.704078 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-scripts\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.707660 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-combined-ca-bundle\") pod 
\"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.707999 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-scripts\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.711888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-db-sync-config-data\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.715109 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-combined-ca-bundle\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.715188 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-etc-machine-id\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.718012 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-config\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.719402 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-config-data\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.727622 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6jwz\" (UniqueName: \"kubernetes.io/projected/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-kube-api-access-x6jwz\") pod \"cinder-db-sync-rpr97\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " pod="openstack/cinder-db-sync-rpr97"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.752919 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvg45\" (UniqueName: \"kubernetes.io/projected/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-kube-api-access-zvg45\") pod \"neutron-db-sync-kbkmb\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") " pod="openstack/neutron-db-sync-kbkmb"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805630 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805707 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-logs\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805734 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-scripts\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805767 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5mcg\" (UniqueName: \"kubernetes.io/projected/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-kube-api-access-s5mcg\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805822 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-combined-ca-bundle\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805843 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805867 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-config\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805891 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lkqq\" (UniqueName: \"kubernetes.io/projected/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-kube-api-access-5lkqq\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805916 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-config-data\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.805939 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.806608 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-logs\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.810099 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-combined-ca-bundle\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.810982 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-config-data\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.812210 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-scripts\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.825958 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lkqq\" (UniqueName: \"kubernetes.io/projected/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-kube-api-access-5lkqq\") pod \"placement-db-sync-fkqxj\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") " pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.855628 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ldr24"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.871384 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kbkmb"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.896310 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rpr97"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.907586 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-config\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.907637 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.907699 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.907753 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.907775 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5mcg\" (UniqueName: \"kubernetes.io/projected/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-kube-api-access-s5mcg\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.907811 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.909628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-config\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.909761 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.910299 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.910884 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.911198 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.924930 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5mcg\" (UniqueName: \"kubernetes.io/projected/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-kube-api-access-s5mcg\") pod \"dnsmasq-dns-5dc4fcdbc-6rw7f\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.924997 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:25:47 crc kubenswrapper[4931]: I0130 05:25:47.971630 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.063954 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zrpdg"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.189072 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2gl9c"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.195322 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:25:48 crc kubenswrapper[4931]: W0130 05:25:48.223347 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8946f758_7352_4859_a3c3_b98bca9b99e4.slice/crio-c26bd96ed17ec97ce33862cd249a01687dfe99dfd46407b53a72a430d1e772ff WatchSource:0}: Error finding container c26bd96ed17ec97ce33862cd249a01687dfe99dfd46407b53a72a430d1e772ff: Status 404 returned error can't find the container with id c26bd96ed17ec97ce33862cd249a01687dfe99dfd46407b53a72a430d1e772ff
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.253302 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.254630 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.257348 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.257558 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lnq99"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.257670 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.257782 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.269387 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.309254 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.314172 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.321866 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.322087 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.324242 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418322 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418615 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418646 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-config-data\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418668 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418694 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418712 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418746 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-scripts\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418788 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-logs\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418808 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-logs\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418831 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418849 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xdb\" (UniqueName: \"kubernetes.io/projected/85061f18-4349-447f-b1ca-4a9a54461745-kube-api-access-l6xdb\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418867 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418915 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418935 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2l92\" (UniqueName: \"kubernetes.io/projected/a718b748-698c-44cc-8a28-b66a97405c41-kube-api-access-f2l92\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.418955 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.447888 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ldr24"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529307 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-scripts\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529372 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-logs\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529397 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-logs\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529437 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529462 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xdb\" (UniqueName: \"kubernetes.io/projected/85061f18-4349-447f-b1ca-4a9a54461745-kube-api-access-l6xdb\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529490 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529524 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529549 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529570 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2l92\" (UniqueName: \"kubernetes.io/projected/a718b748-698c-44cc-8a28-b66a97405c41-kube-api-access-f2l92\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529619 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529638 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529658 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-config-data\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529673 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.529712 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.532305 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-logs\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.532636 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-logs\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.533055 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.537333 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.538353 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.550846 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.554360 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.560531 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-config-data\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.569177 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.574505 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.574965 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.576209 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-scripts\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.585043 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rpr97"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.590532 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.591050 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.592390 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xdb\" (UniqueName: \"kubernetes.io/projected/85061f18-4349-447f-b1ca-4a9a54461745-kube-api-access-l6xdb\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.594162 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2l92\" (UniqueName: \"kubernetes.io/projected/a718b748-698c-44cc-8a28-b66a97405c41-kube-api-access-f2l92\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.609679 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.609968 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.662781 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kbkmb"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.680535 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fkqxj"]
Jan 30 05:25:48 crc kubenswrapper[4931]: W0130 05:25:48.686831 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3ddcee7_a757_43b5_bf76_552cbd8d9078.slice/crio-52736b18c683d128d17308fceb7bf9f60b140aff7c806fb9ba3a93b56cb26bc0 WatchSource:0}: Error finding container 52736b18c683d128d17308fceb7bf9f60b140aff7c806fb9ba3a93b56cb26bc0: Status 404 returned error can't find the container with id 52736b18c683d128d17308fceb7bf9f60b140aff7c806fb9ba3a93b56cb26bc0
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.707558 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"]
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.715444 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.881454 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fkqxj" event={"ID":"438fbbb5-a318-4714-9dac-e3f0fc3f63d3","Type":"ContainerStarted","Data":"46af11befd18c4fdbfdc15f44fec26d441cc576260685156c355baab6e60ddb1"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.902843 4931 generic.go:334] "Generic (PLEG): container finished" podID="81259525-e98e-4119-9071-2e17b0fb1640" containerID="869ab78f6d42b66bebe565b222bf7967e6ecee6cf3f268053a0c2c39b7f70563" exitCode=0
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.902917 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg" event={"ID":"81259525-e98e-4119-9071-2e17b0fb1640","Type":"ContainerDied","Data":"869ab78f6d42b66bebe565b222bf7967e6ecee6cf3f268053a0c2c39b7f70563"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.902943 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg" event={"ID":"81259525-e98e-4119-9071-2e17b0fb1640","Type":"ContainerStarted","Data":"886b6923a64a16210b17afbe28659527c17a059e7a196a2fd3f76fbb734ff512"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.903367 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.942188 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kbkmb" event={"ID":"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719","Type":"ContainerStarted","Data":"7ad5adedbc116cdb578bc473211ba2fbb992ce127a4e1710f27293fb378d6bd0"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.950591 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ldr24" event={"ID":"f3ddcee7-a757-43b5-bf76-552cbd8d9078","Type":"ContainerStarted","Data":"52736b18c683d128d17308fceb7bf9f60b140aff7c806fb9ba3a93b56cb26bc0"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.955880 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpr97" event={"ID":"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6","Type":"ContainerStarted","Data":"f838ce1f11e506679d678bae95342cc3dcecec78b2114b17644603c407ad3619"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.962631 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerStarted","Data":"c0144289ab3513de686db41a01bd60e595d46a3f8bcaea66b48e5c2753f90feb"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.976609 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2gl9c" event={"ID":"8946f758-7352-4859-a3c3-b98bca9b99e4","Type":"ContainerStarted","Data":"43be7b60c2a00736cc6eb1df08fb2617062a5a1b05069e0aa41b60294b71b16c"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.976662 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2gl9c" event={"ID":"8946f758-7352-4859-a3c3-b98bca9b99e4","Type":"ContainerStarted","Data":"c26bd96ed17ec97ce33862cd249a01687dfe99dfd46407b53a72a430d1e772ff"}
Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.979999 4931
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" event={"ID":"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089","Type":"ContainerStarted","Data":"ae9e2c438382358c90343f1970628f2e8ea67a2dacc48b9e3c93a331cd67467e"} Jan 30 05:25:48 crc kubenswrapper[4931]: I0130 05:25:48.997651 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2gl9c" podStartSLOduration=1.997635045 podStartE2EDuration="1.997635045s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:48.995528754 +0000 UTC m=+1084.365439011" watchObservedRunningTime="2026-01-30 05:25:48.997635045 +0000 UTC m=+1084.367545302" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.223629 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.265377 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:49 crc kubenswrapper[4931]: W0130 05:25:49.265536 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85061f18_4349_447f_b1ca_4a9a54461745.slice/crio-e854d57483e337fee1187a460568d29dfc2cb253eaa1fc3cbbd817c58be36b42 WatchSource:0}: Error finding container e854d57483e337fee1187a460568d29dfc2cb253eaa1fc3cbbd817c58be36b42: Status 404 returned error can't find the container with id e854d57483e337fee1187a460568d29dfc2cb253eaa1fc3cbbd817c58be36b42 Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.367894 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-svc\") pod \"81259525-e98e-4119-9071-2e17b0fb1640\" 
(UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.367968 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-sb\") pod \"81259525-e98e-4119-9071-2e17b0fb1640\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.368082 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-nb\") pod \"81259525-e98e-4119-9071-2e17b0fb1640\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.368122 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-config\") pod \"81259525-e98e-4119-9071-2e17b0fb1640\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.368179 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbtg6\" (UniqueName: \"kubernetes.io/projected/81259525-e98e-4119-9071-2e17b0fb1640-kube-api-access-nbtg6\") pod \"81259525-e98e-4119-9071-2e17b0fb1640\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.368210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-swift-storage-0\") pod \"81259525-e98e-4119-9071-2e17b0fb1640\" (UID: \"81259525-e98e-4119-9071-2e17b0fb1640\") " Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.396412 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81259525-e98e-4119-9071-2e17b0fb1640" (UID: "81259525-e98e-4119-9071-2e17b0fb1640"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.397367 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81259525-e98e-4119-9071-2e17b0fb1640-kube-api-access-nbtg6" (OuterVolumeSpecName: "kube-api-access-nbtg6") pod "81259525-e98e-4119-9071-2e17b0fb1640" (UID: "81259525-e98e-4119-9071-2e17b0fb1640"). InnerVolumeSpecName "kube-api-access-nbtg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.400045 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "81259525-e98e-4119-9071-2e17b0fb1640" (UID: "81259525-e98e-4119-9071-2e17b0fb1640"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.410855 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81259525-e98e-4119-9071-2e17b0fb1640" (UID: "81259525-e98e-4119-9071-2e17b0fb1640"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.411101 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81259525-e98e-4119-9071-2e17b0fb1640" (UID: "81259525-e98e-4119-9071-2e17b0fb1640"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.432725 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-config" (OuterVolumeSpecName: "config") pod "81259525-e98e-4119-9071-2e17b0fb1640" (UID: "81259525-e98e-4119-9071-2e17b0fb1640"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.470057 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.470088 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.470118 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.470131 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.470139 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbtg6\" (UniqueName: \"kubernetes.io/projected/81259525-e98e-4119-9071-2e17b0fb1640-kube-api-access-nbtg6\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.470149 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/81259525-e98e-4119-9071-2e17b0fb1640-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.522659 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.821027 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.870642 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.890227 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.995786 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg" event={"ID":"81259525-e98e-4119-9071-2e17b0fb1640","Type":"ContainerDied","Data":"886b6923a64a16210b17afbe28659527c17a059e7a196a2fd3f76fbb734ff512"} Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.995816 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zrpdg" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.995845 4931 scope.go:117] "RemoveContainer" containerID="869ab78f6d42b66bebe565b222bf7967e6ecee6cf3f268053a0c2c39b7f70563" Jan 30 05:25:49 crc kubenswrapper[4931]: I0130 05:25:49.999412 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a718b748-698c-44cc-8a28-b66a97405c41","Type":"ContainerStarted","Data":"c3f9b84bf435cb31e157c89826a592f8cd6b22ac59d33e952e46664d8ee81ab7"} Jan 30 05:25:50 crc kubenswrapper[4931]: I0130 05:25:50.002612 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kbkmb" event={"ID":"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719","Type":"ContainerStarted","Data":"6a0d3a37541dd8bff3edd7d0762f4af19258be892f0b28a3ee8ffd644ba91460"} Jan 30 05:25:50 crc kubenswrapper[4931]: I0130 05:25:50.006095 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85061f18-4349-447f-b1ca-4a9a54461745","Type":"ContainerStarted","Data":"e854d57483e337fee1187a460568d29dfc2cb253eaa1fc3cbbd817c58be36b42"} Jan 30 05:25:50 crc kubenswrapper[4931]: I0130 05:25:50.009953 4931 generic.go:334] "Generic (PLEG): container finished" podID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerID="2de911fa734d3f7bf71674e62b4beae90797f33e1cefb2483c1ee516fdc3ab44" exitCode=0 Jan 30 05:25:50 crc kubenswrapper[4931]: I0130 05:25:50.010012 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" event={"ID":"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089","Type":"ContainerDied","Data":"2de911fa734d3f7bf71674e62b4beae90797f33e1cefb2483c1ee516fdc3ab44"} Jan 30 05:25:50 crc kubenswrapper[4931]: I0130 05:25:50.057314 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zrpdg"] Jan 30 05:25:50 crc kubenswrapper[4931]: I0130 05:25:50.084391 4931 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zrpdg"] Jan 30 05:25:50 crc kubenswrapper[4931]: I0130 05:25:50.092051 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kbkmb" podStartSLOduration=3.09202944 podStartE2EDuration="3.09202944s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:50.04580906 +0000 UTC m=+1085.415719317" watchObservedRunningTime="2026-01-30 05:25:50.09202944 +0000 UTC m=+1085.461939697" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.025842 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85061f18-4349-447f-b1ca-4a9a54461745","Type":"ContainerStarted","Data":"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7"} Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.028122 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85061f18-4349-447f-b1ca-4a9a54461745","Type":"ContainerStarted","Data":"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c"} Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.027062 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-httpd" containerID="cri-o://49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7" gracePeriod=30 Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.026631 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-log" containerID="cri-o://0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c" 
gracePeriod=30 Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.037895 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" event={"ID":"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089","Type":"ContainerStarted","Data":"00f6a2dd44878296bf4733164be83ce28b802aa8c1f8943860365c668511c527"} Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.038047 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.041112 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a718b748-698c-44cc-8a28-b66a97405c41","Type":"ContainerStarted","Data":"372dbc3e463623e2b9f3493644a607e3b3dd6b5d454b6497db8ef4d380851ed9"} Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.048553 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.048511627 podStartE2EDuration="4.048511627s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:51.0430667 +0000 UTC m=+1086.412976957" watchObservedRunningTime="2026-01-30 05:25:51.048511627 +0000 UTC m=+1086.418421884" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.079250 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" podStartSLOduration=4.07921117 podStartE2EDuration="4.07921117s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:51.072038093 +0000 UTC m=+1086.441948360" watchObservedRunningTime="2026-01-30 05:25:51.07921117 +0000 UTC m=+1086.449121427" Jan 30 05:25:51 crc kubenswrapper[4931]: 
I0130 05:25:51.434770 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81259525-e98e-4119-9071-2e17b0fb1640" path="/var/lib/kubelet/pods/81259525-e98e-4119-9071-2e17b0fb1640/volumes" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.638469 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.808679 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-httpd-run\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809028 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-combined-ca-bundle\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809060 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809090 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-config-data\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809121 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-logs\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809144 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-scripts\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809161 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809217 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xdb\" (UniqueName: \"kubernetes.io/projected/85061f18-4349-447f-b1ca-4a9a54461745-kube-api-access-l6xdb\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809250 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-public-tls-certs\") pod \"85061f18-4349-447f-b1ca-4a9a54461745\" (UID: \"85061f18-4349-447f-b1ca-4a9a54461745\") " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809557 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.809690 4931 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-logs" (OuterVolumeSpecName: "logs") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.815592 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.817559 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-scripts" (OuterVolumeSpecName: "scripts") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.818627 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85061f18-4349-447f-b1ca-4a9a54461745-kube-api-access-l6xdb" (OuterVolumeSpecName: "kube-api-access-l6xdb") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). InnerVolumeSpecName "kube-api-access-l6xdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.838988 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.862466 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-config-data" (OuterVolumeSpecName: "config-data") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.874195 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "85061f18-4349-447f-b1ca-4a9a54461745" (UID: "85061f18-4349-447f-b1ca-4a9a54461745"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.910942 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xdb\" (UniqueName: \"kubernetes.io/projected/85061f18-4349-447f-b1ca-4a9a54461745-kube-api-access-l6xdb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.910980 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.910989 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.911024 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.911036 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.911045 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85061f18-4349-447f-b1ca-4a9a54461745-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.911054 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85061f18-4349-447f-b1ca-4a9a54461745-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:51 crc kubenswrapper[4931]: I0130 05:25:51.948218 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.012587 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.057771 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a718b748-698c-44cc-8a28-b66a97405c41","Type":"ContainerStarted","Data":"9c62317118ee4bc559fe155e8ef7df2b681354b8f9e1e6b6cb1521be8624a39c"} Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.057992 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-log" containerID="cri-o://372dbc3e463623e2b9f3493644a607e3b3dd6b5d454b6497db8ef4d380851ed9" gracePeriod=30 Jan 30 05:25:52 crc 
kubenswrapper[4931]: I0130 05:25:52.058202 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-httpd" containerID="cri-o://9c62317118ee4bc559fe155e8ef7df2b681354b8f9e1e6b6cb1521be8624a39c" gracePeriod=30
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.063085 4931 generic.go:334] "Generic (PLEG): container finished" podID="85061f18-4349-447f-b1ca-4a9a54461745" containerID="49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7" exitCode=143
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.063124 4931 generic.go:334] "Generic (PLEG): container finished" podID="85061f18-4349-447f-b1ca-4a9a54461745" containerID="0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c" exitCode=143
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.063212 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85061f18-4349-447f-b1ca-4a9a54461745","Type":"ContainerDied","Data":"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7"}
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.063265 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85061f18-4349-447f-b1ca-4a9a54461745","Type":"ContainerDied","Data":"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c"}
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.063321 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85061f18-4349-447f-b1ca-4a9a54461745","Type":"ContainerDied","Data":"e854d57483e337fee1187a460568d29dfc2cb253eaa1fc3cbbd817c58be36b42"}
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.063341 4931 scope.go:117] "RemoveContainer" containerID="49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.063925 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.093020 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.092996864 podStartE2EDuration="5.092996864s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:52.081504644 +0000 UTC m=+1087.451414911" watchObservedRunningTime="2026-01-30 05:25:52.092996864 +0000 UTC m=+1087.462907121"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.115162 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.121604 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.139270 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 05:25:52 crc kubenswrapper[4931]: E0130 05:25:52.139685 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-httpd"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.139697 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-httpd"
Jan 30 05:25:52 crc kubenswrapper[4931]: E0130 05:25:52.139712 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-log"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.139735 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-log"
Jan 30 05:25:52 crc kubenswrapper[4931]: E0130 05:25:52.139747 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81259525-e98e-4119-9071-2e17b0fb1640" containerName="init"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.139752 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="81259525-e98e-4119-9071-2e17b0fb1640" containerName="init"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.140662 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-httpd"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.140692 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="81259525-e98e-4119-9071-2e17b0fb1640" containerName="init"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.140703 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="85061f18-4349-447f-b1ca-4a9a54461745" containerName="glance-log"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.141529 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.145480 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.145544 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.149261 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215176 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-scripts\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215221 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czq79\" (UniqueName: \"kubernetes.io/projected/18f01f64-f6e4-42f3-80f8-27c86f82eeef-kube-api-access-czq79\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215284 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215334 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215361 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-config-data\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215386 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-logs\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215410 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.215462 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316502 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-config-data\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316567 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-logs\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316597 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316642 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316720 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-scripts\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316744 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czq79\" (UniqueName: \"kubernetes.io/projected/18f01f64-f6e4-42f3-80f8-27c86f82eeef-kube-api-access-czq79\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316779 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.316826 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.322501 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.322972 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-logs\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.323539 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.329629 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.334191 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.336864 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-config-data\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.341794 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czq79\" (UniqueName: \"kubernetes.io/projected/18f01f64-f6e4-42f3-80f8-27c86f82eeef-kube-api-access-czq79\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.342046 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-scripts\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.348540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " pod="openstack/glance-default-external-api-0"
Jan 30 05:25:52 crc kubenswrapper[4931]: I0130 05:25:52.468179 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 05:25:53 crc kubenswrapper[4931]: I0130 05:25:53.075266 4931 generic.go:334] "Generic (PLEG): container finished" podID="a718b748-698c-44cc-8a28-b66a97405c41" containerID="9c62317118ee4bc559fe155e8ef7df2b681354b8f9e1e6b6cb1521be8624a39c" exitCode=0
Jan 30 05:25:53 crc kubenswrapper[4931]: I0130 05:25:53.075292 4931 generic.go:334] "Generic (PLEG): container finished" podID="a718b748-698c-44cc-8a28-b66a97405c41" containerID="372dbc3e463623e2b9f3493644a607e3b3dd6b5d454b6497db8ef4d380851ed9" exitCode=143
Jan 30 05:25:53 crc kubenswrapper[4931]: I0130 05:25:53.075318 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a718b748-698c-44cc-8a28-b66a97405c41","Type":"ContainerDied","Data":"9c62317118ee4bc559fe155e8ef7df2b681354b8f9e1e6b6cb1521be8624a39c"}
Jan 30 05:25:53 crc kubenswrapper[4931]: I0130 05:25:53.075396 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a718b748-698c-44cc-8a28-b66a97405c41","Type":"ContainerDied","Data":"372dbc3e463623e2b9f3493644a607e3b3dd6b5d454b6497db8ef4d380851ed9"}
Jan 30 05:25:53 crc kubenswrapper[4931]: I0130 05:25:53.079978 4931 generic.go:334] "Generic (PLEG): container finished" podID="8946f758-7352-4859-a3c3-b98bca9b99e4" containerID="43be7b60c2a00736cc6eb1df08fb2617062a5a1b05069e0aa41b60294b71b16c" exitCode=0
Jan 30 05:25:53 crc kubenswrapper[4931]: I0130 05:25:53.080019 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2gl9c" event={"ID":"8946f758-7352-4859-a3c3-b98bca9b99e4","Type":"ContainerDied","Data":"43be7b60c2a00736cc6eb1df08fb2617062a5a1b05069e0aa41b60294b71b16c"}
Jan 30 05:25:53 crc kubenswrapper[4931]: I0130 05:25:53.457969 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85061f18-4349-447f-b1ca-4a9a54461745" path="/var/lib/kubelet/pods/85061f18-4349-447f-b1ca-4a9a54461745/volumes"
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.450699 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.459384 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.604658 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-httpd-run\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605072 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-fernet-keys\") pod \"8946f758-7352-4859-a3c3-b98bca9b99e4\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605277 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-config-data\") pod \"8946f758-7352-4859-a3c3-b98bca9b99e4\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605471 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605490 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7wwl\" (UniqueName: \"kubernetes.io/projected/8946f758-7352-4859-a3c3-b98bca9b99e4-kube-api-access-p7wwl\") pod \"8946f758-7352-4859-a3c3-b98bca9b99e4\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-config-data\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605714 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-credential-keys\") pod \"8946f758-7352-4859-a3c3-b98bca9b99e4\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605773 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-combined-ca-bundle\") pod \"8946f758-7352-4859-a3c3-b98bca9b99e4\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605847 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-scripts\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605888 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-combined-ca-bundle\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.605947 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-internal-tls-certs\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.606012 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-scripts\") pod \"8946f758-7352-4859-a3c3-b98bca9b99e4\" (UID: \"8946f758-7352-4859-a3c3-b98bca9b99e4\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.606044 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-logs\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.606461 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-logs" (OuterVolumeSpecName: "logs") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.606811 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2l92\" (UniqueName: \"kubernetes.io/projected/a718b748-698c-44cc-8a28-b66a97405c41-kube-api-access-f2l92\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.606857 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a718b748-698c-44cc-8a28-b66a97405c41\" (UID: \"a718b748-698c-44cc-8a28-b66a97405c41\") "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.607708 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.607737 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a718b748-698c-44cc-8a28-b66a97405c41-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.611899 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8946f758-7352-4859-a3c3-b98bca9b99e4" (UID: "8946f758-7352-4859-a3c3-b98bca9b99e4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.612679 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-scripts" (OuterVolumeSpecName: "scripts") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.612766 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-scripts" (OuterVolumeSpecName: "scripts") pod "8946f758-7352-4859-a3c3-b98bca9b99e4" (UID: "8946f758-7352-4859-a3c3-b98bca9b99e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.616103 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.625150 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a718b748-698c-44cc-8a28-b66a97405c41-kube-api-access-f2l92" (OuterVolumeSpecName: "kube-api-access-f2l92") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "kube-api-access-f2l92". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.624601 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8946f758-7352-4859-a3c3-b98bca9b99e4" (UID: "8946f758-7352-4859-a3c3-b98bca9b99e4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.627085 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8946f758-7352-4859-a3c3-b98bca9b99e4-kube-api-access-p7wwl" (OuterVolumeSpecName: "kube-api-access-p7wwl") pod "8946f758-7352-4859-a3c3-b98bca9b99e4" (UID: "8946f758-7352-4859-a3c3-b98bca9b99e4"). InnerVolumeSpecName "kube-api-access-p7wwl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.640861 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.641565 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-config-data" (OuterVolumeSpecName: "config-data") pod "8946f758-7352-4859-a3c3-b98bca9b99e4" (UID: "8946f758-7352-4859-a3c3-b98bca9b99e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.657691 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-config-data" (OuterVolumeSpecName: "config-data") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.660537 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8946f758-7352-4859-a3c3-b98bca9b99e4" (UID: "8946f758-7352-4859-a3c3-b98bca9b99e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.690940 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a718b748-698c-44cc-8a28-b66a97405c41" (UID: "a718b748-698c-44cc-8a28-b66a97405c41"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709602 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709643 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709657 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2l92\" (UniqueName: \"kubernetes.io/projected/a718b748-698c-44cc-8a28-b66a97405c41-kube-api-access-f2l92\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709696 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709710 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709722 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709733 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7wwl\" (UniqueName: \"kubernetes.io/projected/8946f758-7352-4859-a3c3-b98bca9b99e4-kube-api-access-p7wwl\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709744 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709756 4931 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709767 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8946f758-7352-4859-a3c3-b98bca9b99e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709777 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.709789 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a718b748-698c-44cc-8a28-b66a97405c41-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.736983 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Jan 30 05:25:56 crc kubenswrapper[4931]: I0130 05:25:56.811579 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.123710 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a718b748-698c-44cc-8a28-b66a97405c41","Type":"ContainerDied","Data":"c3f9b84bf435cb31e157c89826a592f8cd6b22ac59d33e952e46664d8ee81ab7"}
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.123803 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.129439 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2gl9c" event={"ID":"8946f758-7352-4859-a3c3-b98bca9b99e4","Type":"ContainerDied","Data":"c26bd96ed17ec97ce33862cd249a01687dfe99dfd46407b53a72a430d1e772ff"}
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.129472 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26bd96ed17ec97ce33862cd249a01687dfe99dfd46407b53a72a430d1e772ff"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.129517 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2gl9c"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.164924 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.174997 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.197524 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:25:57 crc kubenswrapper[4931]: E0130 05:25:57.197915 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-httpd"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.197931 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-httpd"
Jan 30 05:25:57 crc kubenswrapper[4931]: E0130 05:25:57.197947 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-log"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.197953 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-log"
Jan 30 05:25:57 crc kubenswrapper[4931]: E0130 05:25:57.197966 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8946f758-7352-4859-a3c3-b98bca9b99e4" containerName="keystone-bootstrap"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.197972 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8946f758-7352-4859-a3c3-b98bca9b99e4" containerName="keystone-bootstrap"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.198131 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-log"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.198141 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a718b748-698c-44cc-8a28-b66a97405c41" containerName="glance-httpd"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.198151 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="8946f758-7352-4859-a3c3-b98bca9b99e4" containerName="keystone-bootstrap"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.199016 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.201622 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.202640 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.208307 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.321949 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.322348 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn59m\" (UniqueName: \"kubernetes.io/projected/97f44787-3f37-44f1-85a5-4acffef71d95-kube-api-access-vn59m\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.322401 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.322472 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.322506 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.322851 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-logs\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.322931 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.323005 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0"
Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.363124 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.363190 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424146 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424223 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424245 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn59m\" (UniqueName: \"kubernetes.io/projected/97f44787-3f37-44f1-85a5-4acffef71d95-kube-api-access-vn59m\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424285 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424327 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424354 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424431 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-logs\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424462 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.424883 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"97f44787-3f37-44f1-85a5-4acffef71d95\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.432053 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.432724 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-logs\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.436048 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.437174 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a718b748-698c-44cc-8a28-b66a97405c41" path="/var/lib/kubelet/pods/a718b748-698c-44cc-8a28-b66a97405c41/volumes" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.437941 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.438719 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.440135 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.453815 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn59m\" (UniqueName: \"kubernetes.io/projected/97f44787-3f37-44f1-85a5-4acffef71d95-kube-api-access-vn59m\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.458767 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.530111 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.555263 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2gl9c"] Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.562319 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2gl9c"] Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.633085 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sdn7d"] Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.634344 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.638882 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.638945 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.639121 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.639627 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.639938 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llv5h" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.650841 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sdn7d"] Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.732486 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9zg\" (UniqueName: \"kubernetes.io/projected/2400d2d7-1da5-4a38-a558-c970226f95b9-kube-api-access-6g9zg\") pod 
\"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.732558 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-config-data\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.732638 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-combined-ca-bundle\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.732678 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-fernet-keys\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.732703 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-credential-keys\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.732745 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-scripts\") pod \"keystone-bootstrap-sdn7d\" 
(UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.834729 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9zg\" (UniqueName: \"kubernetes.io/projected/2400d2d7-1da5-4a38-a558-c970226f95b9-kube-api-access-6g9zg\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.834813 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-config-data\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.834858 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-combined-ca-bundle\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.834887 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-fernet-keys\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.834918 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-credential-keys\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 
30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.834972 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-scripts\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.840494 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-fernet-keys\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.841958 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-combined-ca-bundle\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.843822 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-scripts\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.852663 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-credential-keys\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.855326 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-config-data\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.860152 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9zg\" (UniqueName: \"kubernetes.io/projected/2400d2d7-1da5-4a38-a558-c970226f95b9-kube-api-access-6g9zg\") pod \"keystone-bootstrap-sdn7d\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") " pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.960018 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:25:57 crc kubenswrapper[4931]: I0130 05:25:57.973411 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" Jan 30 05:25:58 crc kubenswrapper[4931]: I0130 05:25:58.037923 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"] Jan 30 05:25:58 crc kubenswrapper[4931]: I0130 05:25:58.038330 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="dnsmasq-dns" containerID="cri-o://566ade23da173169f793e0bfc68dfb7fc94d967bbb01c1ecaa6d6c7476150a48" gracePeriod=10 Jan 30 05:25:58 crc kubenswrapper[4931]: I0130 05:25:58.299086 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Jan 30 05:25:59 crc kubenswrapper[4931]: I0130 05:25:59.156706 4931 generic.go:334] "Generic (PLEG): container finished" podID="be176172-3d0c-47ae-aa98-d7ee20022f44" 
containerID="566ade23da173169f793e0bfc68dfb7fc94d967bbb01c1ecaa6d6c7476150a48" exitCode=0 Jan 30 05:25:59 crc kubenswrapper[4931]: I0130 05:25:59.156846 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" event={"ID":"be176172-3d0c-47ae-aa98-d7ee20022f44","Type":"ContainerDied","Data":"566ade23da173169f793e0bfc68dfb7fc94d967bbb01c1ecaa6d6c7476150a48"} Jan 30 05:25:59 crc kubenswrapper[4931]: I0130 05:25:59.439235 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8946f758-7352-4859-a3c3-b98bca9b99e4" path="/var/lib/kubelet/pods/8946f758-7352-4859-a3c3-b98bca9b99e4/volumes" Jan 30 05:26:00 crc kubenswrapper[4931]: I0130 05:26:00.160253 4931 scope.go:117] "RemoveContainer" containerID="0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c" Jan 30 05:26:01 crc kubenswrapper[4931]: E0130 05:26:01.098112 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Jan 30 05:26:01 crc kubenswrapper[4931]: E0130 05:26:01.098728 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n698h544h5d6h5f4hcbh5d4h5fh579hch65ch566h667h64fh56bh5f8hf4h59dh557h8dh54fh8h557h684h667h575h5f4hd6h686h64ch686h644h5bdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nl9x6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(debbaca0-0d1f-47cd-bb8e-8e09e4a65307): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:26:03 crc kubenswrapper[4931]: I0130 05:26:03.298866 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Jan 30 05:26:08 crc kubenswrapper[4931]: I0130 05:26:08.298944 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Jan 30 05:26:08 crc kubenswrapper[4931]: I0130 05:26:08.299668 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.449413 4931 scope.go:117] "RemoveContainer" containerID="49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7" 
Jan 30 05:26:10 crc kubenswrapper[4931]: E0130 05:26:10.450371 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7\": container with ID starting with 49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7 not found: ID does not exist" containerID="49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.450414 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7"} err="failed to get container status \"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7\": rpc error: code = NotFound desc = could not find container \"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7\": container with ID starting with 49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7 not found: ID does not exist" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.450464 4931 scope.go:117] "RemoveContainer" containerID="0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c" Jan 30 05:26:10 crc kubenswrapper[4931]: E0130 05:26:10.450733 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c\": container with ID starting with 0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c not found: ID does not exist" containerID="0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.450753 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c"} err="failed to get container status 
\"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c\": rpc error: code = NotFound desc = could not find container \"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c\": container with ID starting with 0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c not found: ID does not exist" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.450765 4931 scope.go:117] "RemoveContainer" containerID="49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.450957 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7"} err="failed to get container status \"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7\": rpc error: code = NotFound desc = could not find container \"49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7\": container with ID starting with 49b76283e35cae85c5b79682a6f7a6d67465b0ab8bd0c34c0e7867dac854fcf7 not found: ID does not exist" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.450971 4931 scope.go:117] "RemoveContainer" containerID="0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.451168 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c"} err="failed to get container status \"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c\": rpc error: code = NotFound desc = could not find container \"0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c\": container with ID starting with 0e5c240e7423824a5311e12dfc17b9311e67c5ad4c50c5d91193f61d25e11d1c not found: ID does not exist" Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.451182 4931 scope.go:117] "RemoveContainer" 
containerID="9c62317118ee4bc559fe155e8ef7df2b681354b8f9e1e6b6cb1521be8624a39c"
Jan 30 05:26:10 crc kubenswrapper[4931]: E0130 05:26:10.458039 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49"
Jan 30 05:26:10 crc kubenswrapper[4931]: E0130 05:26:10.458523 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6jwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rpr97_openstack(6dd6723b-baf8-47eb-a774-68a5dfbcc4a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 30 05:26:10 crc kubenswrapper[4931]: E0130 05:26:10.460597 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rpr97" podUID="6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.562613 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.697299 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-svc\") pod \"be176172-3d0c-47ae-aa98-d7ee20022f44\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") "
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.697492 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-sb\") pod \"be176172-3d0c-47ae-aa98-d7ee20022f44\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") "
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.697554 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-swift-storage-0\") pod \"be176172-3d0c-47ae-aa98-d7ee20022f44\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") "
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.697580 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-nb\") pod \"be176172-3d0c-47ae-aa98-d7ee20022f44\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") "
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.697605 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-config\") pod \"be176172-3d0c-47ae-aa98-d7ee20022f44\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") "
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.697620 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p52jc\" (UniqueName: \"kubernetes.io/projected/be176172-3d0c-47ae-aa98-d7ee20022f44-kube-api-access-p52jc\") pod \"be176172-3d0c-47ae-aa98-d7ee20022f44\" (UID: \"be176172-3d0c-47ae-aa98-d7ee20022f44\") "
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.731863 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be176172-3d0c-47ae-aa98-d7ee20022f44-kube-api-access-p52jc" (OuterVolumeSpecName: "kube-api-access-p52jc") pod "be176172-3d0c-47ae-aa98-d7ee20022f44" (UID: "be176172-3d0c-47ae-aa98-d7ee20022f44"). InnerVolumeSpecName "kube-api-access-p52jc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.753284 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-config" (OuterVolumeSpecName: "config") pod "be176172-3d0c-47ae-aa98-d7ee20022f44" (UID: "be176172-3d0c-47ae-aa98-d7ee20022f44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.755371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "be176172-3d0c-47ae-aa98-d7ee20022f44" (UID: "be176172-3d0c-47ae-aa98-d7ee20022f44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.755455 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be176172-3d0c-47ae-aa98-d7ee20022f44" (UID: "be176172-3d0c-47ae-aa98-d7ee20022f44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.758266 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "be176172-3d0c-47ae-aa98-d7ee20022f44" (UID: "be176172-3d0c-47ae-aa98-d7ee20022f44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.767250 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be176172-3d0c-47ae-aa98-d7ee20022f44" (UID: "be176172-3d0c-47ae-aa98-d7ee20022f44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.800470 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.800521 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.800532 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p52jc\" (UniqueName: \"kubernetes.io/projected/be176172-3d0c-47ae-aa98-d7ee20022f44-kube-api-access-p52jc\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.800543 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.800551 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.800558 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be176172-3d0c-47ae-aa98-d7ee20022f44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.887906 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.900457 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sdn7d"]
Jan 30 05:26:10 crc kubenswrapper[4931]: W0130 05:26:10.960385 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2400d2d7_1da5_4a38_a558_c970226f95b9.slice/crio-e46a6a605ff6b2a20359cac2b9ef2f718cb5499b26c58dd581090cd69c65e837 WatchSource:0}: Error finding container e46a6a605ff6b2a20359cac2b9ef2f718cb5499b26c58dd581090cd69c65e837: Status 404 returned error can't find the container with id e46a6a605ff6b2a20359cac2b9ef2f718cb5499b26c58dd581090cd69c65e837
Jan 30 05:26:10 crc kubenswrapper[4931]: I0130 05:26:10.972937 4931 scope.go:117] "RemoveContainer" containerID="372dbc3e463623e2b9f3493644a607e3b3dd6b5d454b6497db8ef4d380851ed9"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.286046 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18f01f64-f6e4-42f3-80f8-27c86f82eeef","Type":"ContainerStarted","Data":"f169988e956408b39f47bea60212630dcedf5b4c3315a89463a6589988357590"}
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.290339 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ldr24" event={"ID":"f3ddcee7-a757-43b5-bf76-552cbd8d9078","Type":"ContainerStarted","Data":"5737a395d9d729d9146705b5cbd342fcc1f4ff9a1712777cd6b02a06ccdce9e4"}
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.299500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerStarted","Data":"ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418"}
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.306354 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ldr24" podStartSLOduration=2.522752548 podStartE2EDuration="24.306332127s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="2026-01-30 05:25:48.689544881 +0000 UTC m=+1084.059455138" lastFinishedPulling="2026-01-30 05:26:10.47312446 +0000 UTC m=+1105.843034717" observedRunningTime="2026-01-30 05:26:11.30386784 +0000 UTC m=+1106.673778097" watchObservedRunningTime="2026-01-30 05:26:11.306332127 +0000 UTC m=+1106.676242384"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.307777 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sdn7d" event={"ID":"2400d2d7-1da5-4a38-a558-c970226f95b9","Type":"ContainerStarted","Data":"703277ac00408ad9f7d1f58fc77ac68f5eef4a1090a051f9ca88ddf484b5fda4"}
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.307820 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sdn7d" event={"ID":"2400d2d7-1da5-4a38-a558-c970226f95b9","Type":"ContainerStarted","Data":"e46a6a605ff6b2a20359cac2b9ef2f718cb5499b26c58dd581090cd69c65e837"}
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.315041 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.315071 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-8c6pt" event={"ID":"be176172-3d0c-47ae-aa98-d7ee20022f44","Type":"ContainerDied","Data":"af809bcfb9bcd948f444820cb7e724048ff5c243bf6772c74d31c5eab0630ea9"}
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.315163 4931 scope.go:117] "RemoveContainer" containerID="566ade23da173169f793e0bfc68dfb7fc94d967bbb01c1ecaa6d6c7476150a48"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.316908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fkqxj" event={"ID":"438fbbb5-a318-4714-9dac-e3f0fc3f63d3","Type":"ContainerStarted","Data":"d39f6f20169ccd7e0eea2f20181dc418bff8322ca21fcf82d4f5c3d022992a6f"}
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.327157 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sdn7d" podStartSLOduration=14.327143485 podStartE2EDuration="14.327143485s" podCreationTimestamp="2026-01-30 05:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:11.323069588 +0000 UTC m=+1106.692979865" watchObservedRunningTime="2026-01-30 05:26:11.327143485 +0000 UTC m=+1106.697053742"
Jan 30 05:26:11 crc kubenswrapper[4931]: E0130 05:26:11.329199 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-rpr97" podUID="6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.344223 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fkqxj" podStartSLOduration=2.612813966 podStartE2EDuration="24.344207687s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="2026-01-30 05:25:48.726381481 +0000 UTC m=+1084.096291738" lastFinishedPulling="2026-01-30 05:26:10.457775202 +0000 UTC m=+1105.827685459" observedRunningTime="2026-01-30 05:26:11.338458778 +0000 UTC m=+1106.708369035" watchObservedRunningTime="2026-01-30 05:26:11.344207687 +0000 UTC m=+1106.714117944"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.374811 4931 scope.go:117] "RemoveContainer" containerID="aa6a0d8cd249f8b0104844bcd59d7c80f0ef6c784ec9f9d65e07215bbb280738"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.389462 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"]
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.394913 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-8c6pt"]
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.441652 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" path="/var/lib/kubelet/pods/be176172-3d0c-47ae-aa98-d7ee20022f44/volumes"
Jan 30 05:26:11 crc kubenswrapper[4931]: I0130 05:26:11.811869 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.337515 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18f01f64-f6e4-42f3-80f8-27c86f82eeef","Type":"ContainerStarted","Data":"6f20ab78e04ca2466a780c5cc51a4b37e0f487abee57f4e067c29bab7787be5d"}
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.337558 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18f01f64-f6e4-42f3-80f8-27c86f82eeef","Type":"ContainerStarted","Data":"d2ded790b556dc13af017d23c970f8fe6d49472a2741355949522d19b2e1e389"}
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.344652 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97f44787-3f37-44f1-85a5-4acffef71d95","Type":"ContainerStarted","Data":"c3997c6bebfa178d2d159b7c46082a54c7bc989ec2ee2d507189f6cfa3f09d57"}
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.344699 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97f44787-3f37-44f1-85a5-4acffef71d95","Type":"ContainerStarted","Data":"7bfff4eea4487971b7e050b186c84e3209413100130292fb4b6aba07f7e36bce"}
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.366409 4931 generic.go:334] "Generic (PLEG): container finished" podID="2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" containerID="6a0d3a37541dd8bff3edd7d0762f4af19258be892f0b28a3ee8ffd644ba91460" exitCode=0
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.367195 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kbkmb" event={"ID":"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719","Type":"ContainerDied","Data":"6a0d3a37541dd8bff3edd7d0762f4af19258be892f0b28a3ee8ffd644ba91460"}
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.376861 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.376838133 podStartE2EDuration="20.376838133s" podCreationTimestamp="2026-01-30 05:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:12.364515219 +0000 UTC m=+1107.734425496" watchObservedRunningTime="2026-01-30 05:26:12.376838133 +0000 UTC m=+1107.746748390"
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.468973 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.469210 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.504344 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:12 crc kubenswrapper[4931]: I0130 05:26:12.510246 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:13 crc kubenswrapper[4931]: I0130 05:26:13.376780 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97f44787-3f37-44f1-85a5-4acffef71d95","Type":"ContainerStarted","Data":"4dc106a9347c18b30457fcfe0ba0955c89ca0037e4655a069609d973aaa2c8d2"}
Jan 30 05:26:13 crc kubenswrapper[4931]: I0130 05:26:13.377106 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:13 crc kubenswrapper[4931]: I0130 05:26:13.377135 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:13 crc kubenswrapper[4931]: I0130 05:26:13.412658 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.412639897 podStartE2EDuration="16.412639897s" podCreationTimestamp="2026-01-30 05:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:13.400724665 +0000 UTC m=+1108.770634942" watchObservedRunningTime="2026-01-30 05:26:13.412639897 +0000 UTC m=+1108.782550154"
Jan 30 05:26:15 crc kubenswrapper[4931]: I0130 05:26:15.435701 4931 generic.go:334] "Generic (PLEG): container finished" podID="2400d2d7-1da5-4a38-a558-c970226f95b9" containerID="703277ac00408ad9f7d1f58fc77ac68f5eef4a1090a051f9ca88ddf484b5fda4" exitCode=0
Jan 30 05:26:15 crc kubenswrapper[4931]: I0130 05:26:15.435801 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sdn7d" event={"ID":"2400d2d7-1da5-4a38-a558-c970226f95b9","Type":"ContainerDied","Data":"703277ac00408ad9f7d1f58fc77ac68f5eef4a1090a051f9ca88ddf484b5fda4"}
Jan 30 05:26:15 crc kubenswrapper[4931]: I0130 05:26:15.438548 4931 generic.go:334] "Generic (PLEG): container finished" podID="438fbbb5-a318-4714-9dac-e3f0fc3f63d3" containerID="d39f6f20169ccd7e0eea2f20181dc418bff8322ca21fcf82d4f5c3d022992a6f" exitCode=0
Jan 30 05:26:15 crc kubenswrapper[4931]: I0130 05:26:15.438579 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fkqxj" event={"ID":"438fbbb5-a318-4714-9dac-e3f0fc3f63d3","Type":"ContainerDied","Data":"d39f6f20169ccd7e0eea2f20181dc418bff8322ca21fcf82d4f5c3d022992a6f"}
Jan 30 05:26:16 crc kubenswrapper[4931]: I0130 05:26:16.449024 4931 generic.go:334] "Generic (PLEG): container finished" podID="f3ddcee7-a757-43b5-bf76-552cbd8d9078" containerID="5737a395d9d729d9146705b5cbd342fcc1f4ff9a1712777cd6b02a06ccdce9e4" exitCode=0
Jan 30 05:26:16 crc kubenswrapper[4931]: I0130 05:26:16.449107 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ldr24" event={"ID":"f3ddcee7-a757-43b5-bf76-552cbd8d9078","Type":"ContainerDied","Data":"5737a395d9d729d9146705b5cbd342fcc1f4ff9a1712777cd6b02a06ccdce9e4"}
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.466031 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kbkmb" event={"ID":"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719","Type":"ContainerDied","Data":"7ad5adedbc116cdb578bc473211ba2fbb992ce127a4e1710f27293fb378d6bd0"}
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.468282 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ad5adedbc116cdb578bc473211ba2fbb992ce127a4e1710f27293fb378d6bd0"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.471176 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sdn7d" event={"ID":"2400d2d7-1da5-4a38-a558-c970226f95b9","Type":"ContainerDied","Data":"e46a6a605ff6b2a20359cac2b9ef2f718cb5499b26c58dd581090cd69c65e837"}
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.471239 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e46a6a605ff6b2a20359cac2b9ef2f718cb5499b26c58dd581090cd69c65e837"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.479781 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fkqxj" event={"ID":"438fbbb5-a318-4714-9dac-e3f0fc3f63d3","Type":"ContainerDied","Data":"46af11befd18c4fdbfdc15f44fec26d441cc576260685156c355baab6e60ddb1"}
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.479840 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46af11befd18c4fdbfdc15f44fec26d441cc576260685156c355baab6e60ddb1"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.530386 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.530502 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.566040 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.581735 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.617841 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kbkmb"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.648236 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fkqxj"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.666882 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sdn7d"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724378 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-config-data\") pod \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724725 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-logs\") pod \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724748 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-credential-keys\") pod \"2400d2d7-1da5-4a38-a558-c970226f95b9\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724780 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-scripts\") pod \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724811 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lkqq\" (UniqueName: \"kubernetes.io/projected/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-kube-api-access-5lkqq\") pod \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724836 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-combined-ca-bundle\") pod \"2400d2d7-1da5-4a38-a558-c970226f95b9\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724861 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-config-data\") pod \"2400d2d7-1da5-4a38-a558-c970226f95b9\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724881 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-config\") pod \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724913 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-scripts\") pod \"2400d2d7-1da5-4a38-a558-c970226f95b9\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724941 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-combined-ca-bundle\") pod \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.724985 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvg45\" (UniqueName: \"kubernetes.io/projected/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-kube-api-access-zvg45\") pod \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\" (UID: \"2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.725001 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-fernet-keys\") pod \"2400d2d7-1da5-4a38-a558-c970226f95b9\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.725050 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-combined-ca-bundle\") pod \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\" (UID: \"438fbbb5-a318-4714-9dac-e3f0fc3f63d3\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.725075 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g9zg\" (UniqueName: \"kubernetes.io/projected/2400d2d7-1da5-4a38-a558-c970226f95b9-kube-api-access-6g9zg\") pod \"2400d2d7-1da5-4a38-a558-c970226f95b9\" (UID: \"2400d2d7-1da5-4a38-a558-c970226f95b9\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.727022 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-logs" (OuterVolumeSpecName: "logs") pod "438fbbb5-a318-4714-9dac-e3f0fc3f63d3" (UID: "438fbbb5-a318-4714-9dac-e3f0fc3f63d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.729769 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-kube-api-access-5lkqq" (OuterVolumeSpecName: "kube-api-access-5lkqq") pod "438fbbb5-a318-4714-9dac-e3f0fc3f63d3" (UID: "438fbbb5-a318-4714-9dac-e3f0fc3f63d3"). InnerVolumeSpecName "kube-api-access-5lkqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.730541 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-scripts" (OuterVolumeSpecName: "scripts") pod "438fbbb5-a318-4714-9dac-e3f0fc3f63d3" (UID: "438fbbb5-a318-4714-9dac-e3f0fc3f63d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.732132 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2400d2d7-1da5-4a38-a558-c970226f95b9-kube-api-access-6g9zg" (OuterVolumeSpecName: "kube-api-access-6g9zg") pod "2400d2d7-1da5-4a38-a558-c970226f95b9" (UID: "2400d2d7-1da5-4a38-a558-c970226f95b9"). InnerVolumeSpecName "kube-api-access-6g9zg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.732844 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-kube-api-access-zvg45" (OuterVolumeSpecName: "kube-api-access-zvg45") pod "2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" (UID: "2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719"). InnerVolumeSpecName "kube-api-access-zvg45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.748126 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2400d2d7-1da5-4a38-a558-c970226f95b9" (UID: "2400d2d7-1da5-4a38-a558-c970226f95b9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.763513 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-config-data" (OuterVolumeSpecName: "config-data") pod "438fbbb5-a318-4714-9dac-e3f0fc3f63d3" (UID: "438fbbb5-a318-4714-9dac-e3f0fc3f63d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.763597 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-scripts" (OuterVolumeSpecName: "scripts") pod "2400d2d7-1da5-4a38-a558-c970226f95b9" (UID: "2400d2d7-1da5-4a38-a558-c970226f95b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.763611 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2400d2d7-1da5-4a38-a558-c970226f95b9" (UID: "2400d2d7-1da5-4a38-a558-c970226f95b9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.767075 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "438fbbb5-a318-4714-9dac-e3f0fc3f63d3" (UID: "438fbbb5-a318-4714-9dac-e3f0fc3f63d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.768679 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-config" (OuterVolumeSpecName: "config") pod "2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" (UID: "2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.770140 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-config-data" (OuterVolumeSpecName: "config-data") pod "2400d2d7-1da5-4a38-a558-c970226f95b9" (UID: "2400d2d7-1da5-4a38-a558-c970226f95b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.771091 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" (UID: "2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.784873 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2400d2d7-1da5-4a38-a558-c970226f95b9" (UID: "2400d2d7-1da5-4a38-a558-c970226f95b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.795225 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ldr24"
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.833845 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-db-sync-config-data\") pod \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.834066 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p852h\" (UniqueName: \"kubernetes.io/projected/f3ddcee7-a757-43b5-bf76-552cbd8d9078-kube-api-access-p852h\") pod \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.834099 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-combined-ca-bundle\") pod \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\" (UID: \"f3ddcee7-a757-43b5-bf76-552cbd8d9078\") "
Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.837125 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-db-sync-config-data"
(OuterVolumeSpecName: "db-sync-config-data") pod "f3ddcee7-a757-43b5-bf76-552cbd8d9078" (UID: "f3ddcee7-a757-43b5-bf76-552cbd8d9078"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.837691 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ddcee7-a757-43b5-bf76-552cbd8d9078-kube-api-access-p852h" (OuterVolumeSpecName: "kube-api-access-p852h") pod "f3ddcee7-a757-43b5-bf76-552cbd8d9078" (UID: "f3ddcee7-a757-43b5-bf76-552cbd8d9078"). InnerVolumeSpecName "kube-api-access-p852h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.844968 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvg45\" (UniqueName: \"kubernetes.io/projected/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-kube-api-access-zvg45\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.845092 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.845170 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.845239 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g9zg\" (UniqueName: \"kubernetes.io/projected/2400d2d7-1da5-4a38-a558-c970226f95b9-kube-api-access-6g9zg\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.845481 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.846333 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.846507 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.846596 4931 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.846694 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.846767 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lkqq\" (UniqueName: \"kubernetes.io/projected/438fbbb5-a318-4714-9dac-e3f0fc3f63d3-kube-api-access-5lkqq\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.846846 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.846918 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: 
I0130 05:26:17.846985 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.847052 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2400d2d7-1da5-4a38-a558-c970226f95b9-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.847129 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.847205 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p852h\" (UniqueName: \"kubernetes.io/projected/f3ddcee7-a757-43b5-bf76-552cbd8d9078-kube-api-access-p852h\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.863237 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ddcee7-a757-43b5-bf76-552cbd8d9078" (UID: "f3ddcee7-a757-43b5-bf76-552cbd8d9078"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:17 crc kubenswrapper[4931]: I0130 05:26:17.948873 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ddcee7-a757-43b5-bf76-552cbd8d9078-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.491342 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerStarted","Data":"c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547"} Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.494999 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ldr24" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.495132 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sdn7d" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.495115 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ldr24" event={"ID":"f3ddcee7-a757-43b5-bf76-552cbd8d9078","Type":"ContainerDied","Data":"52736b18c683d128d17308fceb7bf9f60b140aff7c806fb9ba3a93b56cb26bc0"} Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.495220 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52736b18c683d128d17308fceb7bf9f60b140aff7c806fb9ba3a93b56cb26bc0" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.495330 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fkqxj" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.496285 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kbkmb" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.496515 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.496602 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.828733 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6554f656b5-c7z8q"] Jan 30 05:26:18 crc kubenswrapper[4931]: E0130 05:26:18.829041 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" containerName="neutron-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829056 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" containerName="neutron-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: E0130 05:26:18.829067 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ddcee7-a757-43b5-bf76-552cbd8d9078" containerName="barbican-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829073 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ddcee7-a757-43b5-bf76-552cbd8d9078" containerName="barbican-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: E0130 05:26:18.829085 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438fbbb5-a318-4714-9dac-e3f0fc3f63d3" containerName="placement-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829092 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="438fbbb5-a318-4714-9dac-e3f0fc3f63d3" containerName="placement-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: E0130 05:26:18.829100 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="init" Jan 30 05:26:18 crc 
kubenswrapper[4931]: I0130 05:26:18.829105 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="init" Jan 30 05:26:18 crc kubenswrapper[4931]: E0130 05:26:18.829120 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="dnsmasq-dns" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829126 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="dnsmasq-dns" Jan 30 05:26:18 crc kubenswrapper[4931]: E0130 05:26:18.829139 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2400d2d7-1da5-4a38-a558-c970226f95b9" containerName="keystone-bootstrap" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829145 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2400d2d7-1da5-4a38-a558-c970226f95b9" containerName="keystone-bootstrap" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829293 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="438fbbb5-a318-4714-9dac-e3f0fc3f63d3" containerName="placement-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829310 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ddcee7-a757-43b5-bf76-552cbd8d9078" containerName="barbican-db-sync" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829318 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="be176172-3d0c-47ae-aa98-d7ee20022f44" containerName="dnsmasq-dns" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829332 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2400d2d7-1da5-4a38-a558-c970226f95b9" containerName="keystone-bootstrap" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.829340 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" containerName="neutron-db-sync" Jan 30 05:26:18 crc 
kubenswrapper[4931]: I0130 05:26:18.830135 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.865549 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q72vp\" (UniqueName: \"kubernetes.io/projected/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-kube-api-access-q72vp\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.865635 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-svc\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.865684 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-swift-storage-0\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.865701 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-config\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.865759 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-sb\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.865784 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-nb\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.872837 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6554f656b5-c7z8q"] Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.879688 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-67465d5765-cp74w"] Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.881044 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.884493 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.885459 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vt49t" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.890289 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.946477 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67465d5765-cp74w"] Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.967481 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f7d589966-mkfs5"] Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968535 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vl6j\" (UniqueName: \"kubernetes.io/projected/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-kube-api-access-8vl6j\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968593 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-logs\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968662 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q72vp\" (UniqueName: 
\"kubernetes.io/projected/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-kube-api-access-q72vp\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968724 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-svc\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968762 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-swift-storage-0\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968782 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-config\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968807 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data-custom\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968829 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968842 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-sb\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968894 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-combined-ca-bundle\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.968919 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-nb\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.969834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-nb\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 
05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.970164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-swift-storage-0\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.970414 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-config\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.970930 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-svc\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.971102 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-97bdbd495-2prdt"] Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.971300 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.974935 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-sb\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.981273 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.987329 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.992632 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.992865 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-llv5h" Jan 30 05:26:18 crc kubenswrapper[4931]: I0130 05:26:18.992982 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.004681 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.009504 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.018928 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-97bdbd495-2prdt"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.032280 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q72vp\" (UniqueName: \"kubernetes.io/projected/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-kube-api-access-q72vp\") pod \"dnsmasq-dns-6554f656b5-c7z8q\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.054483 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f7d589966-mkfs5"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070382 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data-custom\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070472 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data-custom\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070498 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-fernet-keys\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070517 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070540 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-combined-ca-bundle\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070574 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d8709-a403-4186-83f5-ec76aee793fe-logs\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070598 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-combined-ca-bundle\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070617 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx75c\" (UniqueName: \"kubernetes.io/projected/2d6e5156-6e75-4dff-a322-b3d43e596c7e-kube-api-access-hx75c\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070641 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrvbk\" (UniqueName: \"kubernetes.io/projected/807d8709-a403-4186-83f5-ec76aee793fe-kube-api-access-jrvbk\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070689 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vl6j\" (UniqueName: \"kubernetes.io/projected/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-kube-api-access-8vl6j\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" 
Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070711 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-logs\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.070740 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-public-tls-certs\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.071204 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-logs\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.071276 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-internal-tls-certs\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.071934 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-combined-ca-bundle\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: 
I0130 05:26:19.071988 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-scripts\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.072011 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-credential-keys\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.072065 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-config-data\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.072157 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.079917 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-combined-ca-bundle\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 
05:26:19.085338 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data-custom\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.089825 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.121184 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vl6j\" (UniqueName: \"kubernetes.io/projected/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-kube-api-access-8vl6j\") pod \"barbican-worker-67465d5765-cp74w\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.134812 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d9d68b44b-5gp25"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.136091 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.147401 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.147692 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.147803 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.147903 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fttzx" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.148038 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.156203 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176480 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-combined-ca-bundle\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176532 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bz2z\" (UniqueName: \"kubernetes.io/projected/b92991ff-5b79-452a-b5ac-9dc90ab42f68-kube-api-access-9bz2z\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176556 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d8709-a403-4186-83f5-ec76aee793fe-logs\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176581 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx75c\" (UniqueName: \"kubernetes.io/projected/2d6e5156-6e75-4dff-a322-b3d43e596c7e-kube-api-access-hx75c\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176600 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrvbk\" (UniqueName: \"kubernetes.io/projected/807d8709-a403-4186-83f5-ec76aee793fe-kube-api-access-jrvbk\") pod 
\"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176617 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-internal-tls-certs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176632 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-scripts\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176663 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-public-tls-certs\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176679 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-config-data\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176694 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-internal-tls-certs\") pod 
\"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176715 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-combined-ca-bundle\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176732 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-scripts\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176749 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-credential-keys\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176764 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92991ff-5b79-452a-b5ac-9dc90ab42f68-logs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176790 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-config-data\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " 
pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176807 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-public-tls-certs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176837 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176856 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data-custom\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176877 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-combined-ca-bundle\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.176907 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-fernet-keys\") pod \"keystone-97bdbd495-2prdt\" (UID: 
\"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.181970 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-internal-tls-certs\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.184883 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9d68b44b-5gp25"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.185641 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d8709-a403-4186-83f5-ec76aee793fe-logs\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.191826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data-custom\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.194440 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-scripts\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.194786 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.198343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-combined-ca-bundle\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.198862 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-config-data\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.198871 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-fernet-keys\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.199400 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-combined-ca-bundle\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.200897 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-credential-keys\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc 
kubenswrapper[4931]: I0130 05:26:19.209295 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-public-tls-certs\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.209627 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.237514 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx75c\" (UniqueName: \"kubernetes.io/projected/2d6e5156-6e75-4dff-a322-b3d43e596c7e-kube-api-access-hx75c\") pod \"keystone-97bdbd495-2prdt\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.239508 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrvbk\" (UniqueName: \"kubernetes.io/projected/807d8709-a403-4186-83f5-ec76aee793fe-kube-api-access-jrvbk\") pod \"barbican-keystone-listener-f7d589966-mkfs5\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.272538 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6665f9d796-74mbd"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.277148 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.278977 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-internal-tls-certs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.279030 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-scripts\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.279074 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-config-data\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.279106 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92991ff-5b79-452a-b5ac-9dc90ab42f68-logs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.279133 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-public-tls-certs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 
05:26:19.279185 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-combined-ca-bundle\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.279232 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bz2z\" (UniqueName: \"kubernetes.io/projected/b92991ff-5b79-452a-b5ac-9dc90ab42f68-kube-api-access-9bz2z\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.283774 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92991ff-5b79-452a-b5ac-9dc90ab42f68-logs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.287970 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-internal-tls-certs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.292602 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-combined-ca-bundle\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.292880 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"barbican-api-config-data" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.294385 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.295069 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-config-data\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.297015 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-scripts\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.297470 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-public-tls-certs\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.309242 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6665f9d796-74mbd"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.309664 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.338462 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dc49c789d-5gcj4"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.339781 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.342707 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.343280 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.343361 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ffrzt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.365131 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.366696 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6554f656b5-c7z8q"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.370232 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bz2z\" (UniqueName: \"kubernetes.io/projected/b92991ff-5b79-452a-b5ac-9dc90ab42f68-kube-api-access-9bz2z\") pod \"placement-d9d68b44b-5gp25\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403466 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9lwj\" (UniqueName: \"kubernetes.io/projected/28e3fd91-5906-4368-b156-e0d60f3c268e-kube-api-access-f9lwj\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403533 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-ovndb-tls-certs\") pod 
\"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403603 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e3fd91-5906-4368-b156-e0d60f3c268e-logs\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403624 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-config\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403703 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-combined-ca-bundle\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403749 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-combined-ca-bundle\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403801 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgbtk\" (UniqueName: \"kubernetes.io/projected/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-kube-api-access-fgbtk\") pod 
\"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403817 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data-custom\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403843 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-httpd-config\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.403943 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.515819 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9lwj\" (UniqueName: \"kubernetes.io/projected/28e3fd91-5906-4368-b156-e0d60f3c268e-kube-api-access-f9lwj\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.515869 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-ovndb-tls-certs\") pod 
\"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.515903 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e3fd91-5906-4368-b156-e0d60f3c268e-logs\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.515921 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-config\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.515958 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-combined-ca-bundle\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.515982 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-combined-ca-bundle\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.516008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data-custom\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " 
pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.516021 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgbtk\" (UniqueName: \"kubernetes.io/projected/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-kube-api-access-fgbtk\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.516038 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-httpd-config\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.516078 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.527581 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e3fd91-5906-4368-b156-e0d60f3c268e-logs\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.535281 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data-custom\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 
05:26:19.542587 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.544998 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-httpd-config\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.545435 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-combined-ca-bundle\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.545809 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.551015 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-ovndb-tls-certs\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.554107 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-config\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.564476 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgbtk\" (UniqueName: \"kubernetes.io/projected/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-kube-api-access-fgbtk\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.565461 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc49c789d-5gcj4"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.565654 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-zmls6"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.567820 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.619240 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-combined-ca-bundle\") pod \"neutron-dc49c789d-5gcj4\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") " pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.620567 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9lwj\" (UniqueName: \"kubernetes.io/projected/28e3fd91-5906-4368-b156-e0d60f3c268e-kube-api-access-f9lwj\") pod \"barbican-api-6665f9d796-74mbd\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.631479 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-zmls6"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.647071 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f5d456c6b-66jxb"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.650057 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.664964 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c996f77-c9rqm"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.666485 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.675704 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c996f77-c9rqm"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.690019 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f5d456c6b-66jxb"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.725242 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.729559 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.729630 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.729701 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2lmn\" (UniqueName: \"kubernetes.io/projected/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-kube-api-access-z2lmn\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.729875 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-config\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.729948 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.729969 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.738496 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-687c697484-j2btt"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.739948 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.748924 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.754974 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-798b7dc5fb-xl2zq"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.762445 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.764824 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-794bfbdd44-9msr6"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.767412 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.772338 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-687c697484-j2btt"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.784541 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-798b7dc5fb-xl2zq"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831483 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-ovndb-tls-certs\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831746 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831769 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data-custom\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 
30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831787 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-combined-ca-bundle\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831805 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831824 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-config\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831842 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc 
kubenswrapper[4931]: I0130 05:26:19.831891 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-combined-ca-bundle\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7729e2d8-6c8c-4759-9e5d-535ad1586f47-logs\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831937 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2e60-915e-4447-9465-aa64f7f5c7bb-logs\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831954 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpv7q\" (UniqueName: \"kubernetes.io/projected/7729e2d8-6c8c-4759-9e5d-535ad1586f47-kube-api-access-vpv7q\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.831972 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lmn\" (UniqueName: \"kubernetes.io/projected/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-kube-api-access-z2lmn\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " 
pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832023 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r29mt\" (UniqueName: \"kubernetes.io/projected/728a2e60-915e-4447-9465-aa64f7f5c7bb-kube-api-access-r29mt\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832068 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-config\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832086 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data-custom\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832106 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832135 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-nb\") 
pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832151 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832171 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-httpd-config\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.832185 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jwc6\" (UniqueName: \"kubernetes.io/projected/84203bc9-afb4-42cb-843d-c211490ce275-kube-api-access-6jwc6\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.833843 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.833856 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: 
\"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.834368 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.834633 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.835136 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-config\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.836133 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-794bfbdd44-9msr6"] Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.853265 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2lmn\" (UniqueName: \"kubernetes.io/projected/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-kube-api-access-z2lmn\") pod \"dnsmasq-dns-7bdf86f46f-zmls6\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") " pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.904442 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935112 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935152 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data-custom\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935170 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-combined-ca-bundle\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935191 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-config\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935213 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-public-tls-certs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") 
" pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935231 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-combined-ca-bundle\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935290 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-combined-ca-bundle\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935311 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-scripts\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935326 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7729e2d8-6c8c-4759-9e5d-535ad1586f47-logs\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " 
pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935341 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2e60-915e-4447-9465-aa64f7f5c7bb-logs\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935359 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l264t\" (UniqueName: \"kubernetes.io/projected/c3dfec36-0758-42c6-8c28-997044eb59a3-kube-api-access-l264t\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935377 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpv7q\" (UniqueName: \"kubernetes.io/projected/7729e2d8-6c8c-4759-9e5d-535ad1586f47-kube-api-access-vpv7q\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935393 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-combined-ca-bundle\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935413 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: 
\"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935462 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dfec36-0758-42c6-8c28-997044eb59a3-logs\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935482 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r29mt\" (UniqueName: \"kubernetes.io/projected/728a2e60-915e-4447-9465-aa64f7f5c7bb-kube-api-access-r29mt\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935497 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe4f743-9a60-428f-8b58-14ba160d9fd7-logs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935516 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vcv\" (UniqueName: \"kubernetes.io/projected/ebe4f743-9a60-428f-8b58-14ba160d9fd7-kube-api-access-68vcv\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935557 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data-custom\") pod 
\"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935578 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935619 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-httpd-config\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935633 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jwc6\" (UniqueName: \"kubernetes.io/projected/84203bc9-afb4-42cb-843d-c211490ce275-kube-api-access-6jwc6\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935662 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-config-data\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935680 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-internal-tls-certs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935700 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data-custom\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.935722 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-ovndb-tls-certs\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.943692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.945146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7729e2d8-6c8c-4759-9e5d-535ad1586f47-logs\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.945682 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/728a2e60-915e-4447-9465-aa64f7f5c7bb-logs\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.951123 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-combined-ca-bundle\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.952720 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-ovndb-tls-certs\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.953212 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data-custom\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.953282 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-combined-ca-bundle\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.953869 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-httpd-config\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.954341 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data-custom\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.955624 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-config\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.959495 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.971832 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jwc6\" (UniqueName: \"kubernetes.io/projected/84203bc9-afb4-42cb-843d-c211490ce275-kube-api-access-6jwc6\") pod \"neutron-687c697484-j2btt\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.991083 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpv7q\" (UniqueName: \"kubernetes.io/projected/7729e2d8-6c8c-4759-9e5d-535ad1586f47-kube-api-access-vpv7q\") pod 
\"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:19 crc kubenswrapper[4931]: I0130 05:26:19.991567 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data\") pod \"barbican-worker-7c996f77-c9rqm\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") " pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.014922 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r29mt\" (UniqueName: \"kubernetes.io/projected/728a2e60-915e-4447-9465-aa64f7f5c7bb-kube-api-access-r29mt\") pod \"barbican-keystone-listener-5f5d456c6b-66jxb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") " pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037513 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-scripts\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037565 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l264t\" (UniqueName: \"kubernetes.io/projected/c3dfec36-0758-42c6-8c28-997044eb59a3-kube-api-access-l264t\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037583 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-combined-ca-bundle\") pod 
\"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037613 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037649 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dfec36-0758-42c6-8c28-997044eb59a3-logs\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037667 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe4f743-9a60-428f-8b58-14ba160d9fd7-logs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037685 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vcv\" (UniqueName: \"kubernetes.io/projected/ebe4f743-9a60-428f-8b58-14ba160d9fd7-kube-api-access-68vcv\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037783 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-config-data\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " 
pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037800 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-internal-tls-certs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037819 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data-custom\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037876 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-public-tls-certs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.037891 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-combined-ca-bundle\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.044756 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe4f743-9a60-428f-8b58-14ba160d9fd7-logs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc 
kubenswrapper[4931]: I0130 05:26:20.045211 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dfec36-0758-42c6-8c28-997044eb59a3-logs\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.048694 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-scripts\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.049359 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-combined-ca-bundle\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.064003 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data-custom\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.066049 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-public-tls-certs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.073782 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-internal-tls-certs\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.073873 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-config-data\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.076829 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-combined-ca-bundle\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.077941 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vcv\" (UniqueName: \"kubernetes.io/projected/ebe4f743-9a60-428f-8b58-14ba160d9fd7-kube-api-access-68vcv\") pod \"placement-798b7dc5fb-xl2zq\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") " pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.079025 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l264t\" (UniqueName: \"kubernetes.io/projected/c3dfec36-0758-42c6-8c28-997044eb59a3-kube-api-access-l264t\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.079953 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6554f656b5-c7z8q"] Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 
05:26:20.083469 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data\") pod \"barbican-api-794bfbdd44-9msr6\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") " pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.216947 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.227146 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.240965 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.252371 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.278342 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9d68b44b-5gp25"] Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.301856 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.400872 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f7d589966-mkfs5"] Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.439938 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67465d5765-cp74w"] Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.554813 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6665f9d796-74mbd"] Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.582868 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-97bdbd495-2prdt"] Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.583126 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67465d5765-cp74w" event={"ID":"a83e0ea3-83ba-4e7c-803c-4fd9811318a2","Type":"ContainerStarted","Data":"9f1458d6f86849c7d56580c53cae53507cdf0fec4d72928952c134f8ba2a7ca8"} Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.593274 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9d68b44b-5gp25" event={"ID":"b92991ff-5b79-452a-b5ac-9dc90ab42f68","Type":"ContainerStarted","Data":"040b81795acd0bef7c76b7a99d650deaac66b5fa82f97baf669121be56928797"} Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.596601 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" event={"ID":"807d8709-a403-4186-83f5-ec76aee793fe","Type":"ContainerStarted","Data":"fc0c653d3e574db62881709b302919c961837f9a8fc28421f26c150c1cbda477"} Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.596765 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-zmls6"] Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.601994 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" event={"ID":"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224","Type":"ContainerStarted","Data":"af773d7e6d6c3024589870daad5e39942c3e37e8d1998e13765a6119ce565675"} Jan 30 05:26:20 crc kubenswrapper[4931]: I0130 05:26:20.680737 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc49c789d-5gcj4"] Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.128874 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f5d456c6b-66jxb"] Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.254548 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-687c697484-j2btt"] Jan 30 05:26:21 crc kubenswrapper[4931]: W0130 05:26:21.278268 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84203bc9_afb4_42cb_843d_c211490ce275.slice/crio-ae04b2bb5e94306ea64985be9c8f6deb0c75c6536be8008e597d586d1aee985e WatchSource:0}: Error finding container ae04b2bb5e94306ea64985be9c8f6deb0c75c6536be8008e597d586d1aee985e: Status 404 returned error can't find the container with id ae04b2bb5e94306ea64985be9c8f6deb0c75c6536be8008e597d586d1aee985e Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.312261 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-798b7dc5fb-xl2zq"] Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.495647 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c996f77-c9rqm"] Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.495972 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-794bfbdd44-9msr6"] Jan 30 05:26:21 crc kubenswrapper[4931]: W0130 05:26:21.521879 4931 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7729e2d8_6c8c_4759_9e5d_535ad1586f47.slice/crio-ac16bb78f3ca5ff67b0b11f3773806ecd75b6cd0b938e4013f99b8b4e7b2e044 WatchSource:0}: Error finding container ac16bb78f3ca5ff67b0b11f3773806ecd75b6cd0b938e4013f99b8b4e7b2e044: Status 404 returned error can't find the container with id ac16bb78f3ca5ff67b0b11f3773806ecd75b6cd0b938e4013f99b8b4e7b2e044 Jan 30 05:26:21 crc kubenswrapper[4931]: W0130 05:26:21.528556 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3dfec36_0758_42c6_8c28_997044eb59a3.slice/crio-d2345973c91074357738f420a2e030c74216ac2f1de71e5a31e295982e276e96 WatchSource:0}: Error finding container d2345973c91074357738f420a2e030c74216ac2f1de71e5a31e295982e276e96: Status 404 returned error can't find the container with id d2345973c91074357738f420a2e030c74216ac2f1de71e5a31e295982e276e96 Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.671976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97bdbd495-2prdt" event={"ID":"2d6e5156-6e75-4dff-a322-b3d43e596c7e","Type":"ContainerStarted","Data":"2031f531f783ff9fda1aa19098c42b8b6619a54760d8a1056a1788a1c38b669b"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.672019 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97bdbd495-2prdt" event={"ID":"2d6e5156-6e75-4dff-a322-b3d43e596c7e","Type":"ContainerStarted","Data":"c2c40320b6d71850a7db7d062b86807a450e4758cd147671abdfe8fd00c2df62"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.674622 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.686854 4931 generic.go:334] "Generic (PLEG): container finished" podID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" containerID="42238edb208312fe89370f3d6e71cdbdaaf5a688762779ee0068776a658a91e9" 
exitCode=0 Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.686979 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" event={"ID":"4aa89fd3-2a8a-424c-b3a7-cf743d90a249","Type":"ContainerDied","Data":"42238edb208312fe89370f3d6e71cdbdaaf5a688762779ee0068776a658a91e9"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.687016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" event={"ID":"4aa89fd3-2a8a-424c-b3a7-cf743d90a249","Type":"ContainerStarted","Data":"b7351472d16e045cf1d352d57e3502d62cfe0a1c627e0387d4154e9570e9d7c6"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.688380 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.688482 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.693535 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798b7dc5fb-xl2zq" event={"ID":"ebe4f743-9a60-428f-8b58-14ba160d9fd7","Type":"ContainerStarted","Data":"7f97af972b1577947f7f7edff42a3df45ac3d6eddfca2ad04dcbcbf60edeb902"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.703623 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c996f77-c9rqm" event={"ID":"7729e2d8-6c8c-4759-9e5d-535ad1586f47","Type":"ContainerStarted","Data":"ac16bb78f3ca5ff67b0b11f3773806ecd75b6cd0b938e4013f99b8b4e7b2e044"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.711396 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-97bdbd495-2prdt" podStartSLOduration=3.7113781169999998 podStartE2EDuration="3.711378117s" podCreationTimestamp="2026-01-30 05:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-30 05:26:21.699162797 +0000 UTC m=+1117.069073054" watchObservedRunningTime="2026-01-30 05:26:21.711378117 +0000 UTC m=+1117.081288374" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.725813 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6665f9d796-74mbd" event={"ID":"28e3fd91-5906-4368-b156-e0d60f3c268e","Type":"ContainerStarted","Data":"e29747a73e8f9c7e78f65f5f5c5542788fec252768ae4f62a8f1f67f3d4ca4ed"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.725880 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6665f9d796-74mbd" event={"ID":"28e3fd91-5906-4368-b156-e0d60f3c268e","Type":"ContainerStarted","Data":"fc46b2c7ab19d04106c22b57ea741652e3c63fe169a85fea4405836701cbe7c6"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.725894 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6665f9d796-74mbd" event={"ID":"28e3fd91-5906-4368-b156-e0d60f3c268e","Type":"ContainerStarted","Data":"b479366de18a258ddf192480628e9708d18845ae58ecac6569bfbb633a96f682"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.727983 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.728017 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.742694 4931 generic.go:334] "Generic (PLEG): container finished" podID="6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" containerID="9c3c4b6151a0c51d59294c812422f37d6d21632c4b81b84ecc9451a3cae1e0d5" exitCode=0 Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.742751 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" 
event={"ID":"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224","Type":"ContainerDied","Data":"9c3c4b6151a0c51d59294c812422f37d6d21632c4b81b84ecc9451a3cae1e0d5"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.754240 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" event={"ID":"728a2e60-915e-4447-9465-aa64f7f5c7bb","Type":"ContainerStarted","Data":"cd9a53b66398f13fcc5edf6801d39072217390bf6fb5b5264a9e5d24f429383b"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.762857 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9d68b44b-5gp25" event={"ID":"b92991ff-5b79-452a-b5ac-9dc90ab42f68","Type":"ContainerStarted","Data":"133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.768663 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794bfbdd44-9msr6" event={"ID":"c3dfec36-0758-42c6-8c28-997044eb59a3","Type":"ContainerStarted","Data":"d2345973c91074357738f420a2e030c74216ac2f1de71e5a31e295982e276e96"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.801002 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6665f9d796-74mbd" podStartSLOduration=2.800987688 podStartE2EDuration="2.800987688s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:21.770907151 +0000 UTC m=+1117.140817438" watchObservedRunningTime="2026-01-30 05:26:21.800987688 +0000 UTC m=+1117.170897945" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.802500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-687c697484-j2btt" event={"ID":"84203bc9-afb4-42cb-843d-c211490ce275","Type":"ContainerStarted","Data":"ae04b2bb5e94306ea64985be9c8f6deb0c75c6536be8008e597d586d1aee985e"} Jan 30 
05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.835685 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc49c789d-5gcj4" event={"ID":"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec","Type":"ContainerStarted","Data":"e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.835728 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc49c789d-5gcj4" event={"ID":"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec","Type":"ContainerStarted","Data":"f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.835736 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc49c789d-5gcj4" event={"ID":"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec","Type":"ContainerStarted","Data":"49b94c209fcd846b366cb60120c52ee63d74a76288f62e76634d76df2ff577f1"} Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.836845 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:21 crc kubenswrapper[4931]: I0130 05:26:21.863488 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dc49c789d-5gcj4" podStartSLOduration=2.863461923 podStartE2EDuration="2.863461923s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:21.853478332 +0000 UTC m=+1117.223388589" watchObservedRunningTime="2026-01-30 05:26:21.863461923 +0000 UTC m=+1117.233372180" Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.106790 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.215194 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.295914 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-swift-storage-0\") pod \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.296061 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-svc\") pod \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.296104 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-nb\") pod \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.296136 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q72vp\" (UniqueName: \"kubernetes.io/projected/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-kube-api-access-q72vp\") pod \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.296204 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-sb\") pod \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") " Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.296253 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-config\") pod \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\" (UID: \"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224\") "
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.335429 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-kube-api-access-q72vp" (OuterVolumeSpecName: "kube-api-access-q72vp") pod "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" (UID: "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224"). InnerVolumeSpecName "kube-api-access-q72vp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.403343 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q72vp\" (UniqueName: \"kubernetes.io/projected/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-kube-api-access-q72vp\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.528360 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" (UID: "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.536602 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-config" (OuterVolumeSpecName: "config") pod "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" (UID: "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.556944 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" (UID: "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.557078 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" (UID: "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.573554 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" (UID: "6d840d6e-c5f2-4b2e-9dc1-1b6df0950224"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.610708 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.610749 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.610762 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.610775 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.610787 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.864600 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-687c697484-j2btt" event={"ID":"84203bc9-afb4-42cb-843d-c211490ce275","Type":"ContainerStarted","Data":"14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.864643 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-687c697484-j2btt" event={"ID":"84203bc9-afb4-42cb-843d-c211490ce275","Type":"ContainerStarted","Data":"56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.865736 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-687c697484-j2btt"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.887727 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6554f656b5-c7z8q" event={"ID":"6d840d6e-c5f2-4b2e-9dc1-1b6df0950224","Type":"ContainerDied","Data":"af773d7e6d6c3024589870daad5e39942c3e37e8d1998e13765a6119ce565675"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.887780 4931 scope.go:117] "RemoveContainer" containerID="9c3c4b6151a0c51d59294c812422f37d6d21632c4b81b84ecc9451a3cae1e0d5"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.887901 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6554f656b5-c7z8q"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.895720 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-687c697484-j2btt" podStartSLOduration=3.895704927 podStartE2EDuration="3.895704927s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:22.880737621 +0000 UTC m=+1118.250647878" watchObservedRunningTime="2026-01-30 05:26:22.895704927 +0000 UTC m=+1118.265615184"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.904749 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9d68b44b-5gp25" event={"ID":"b92991ff-5b79-452a-b5ac-9dc90ab42f68","Type":"ContainerStarted","Data":"fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.906337 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d9d68b44b-5gp25"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.906363 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d9d68b44b-5gp25"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.923261 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794bfbdd44-9msr6" event={"ID":"c3dfec36-0758-42c6-8c28-997044eb59a3","Type":"ContainerStarted","Data":"7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.923313 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794bfbdd44-9msr6" event={"ID":"c3dfec36-0758-42c6-8c28-997044eb59a3","Type":"ContainerStarted","Data":"845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.923359 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-794bfbdd44-9msr6"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.923372 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-794bfbdd44-9msr6"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.941032 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d9d68b44b-5gp25" podStartSLOduration=3.941008478 podStartE2EDuration="3.941008478s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:22.931177862 +0000 UTC m=+1118.301088119" watchObservedRunningTime="2026-01-30 05:26:22.941008478 +0000 UTC m=+1118.310918735"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.951343 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" event={"ID":"4aa89fd3-2a8a-424c-b3a7-cf743d90a249","Type":"ContainerStarted","Data":"d07be3298c7001c20f2b88e58ea237b62170a8ccd4a24bfb286b3d0a2bff7807"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.952149 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.955943 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798b7dc5fb-xl2zq" event={"ID":"ebe4f743-9a60-428f-8b58-14ba160d9fd7","Type":"ContainerStarted","Data":"e8519c60ec437acc9c9b5934ab3951ad5ad349186eda26fc85c2bae9b3010203"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.955983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798b7dc5fb-xl2zq" event={"ID":"ebe4f743-9a60-428f-8b58-14ba160d9fd7","Type":"ContainerStarted","Data":"1d733edb3ceaca43f34355e23bbaaced9e55a731057ead7b89c96398337d6e11"}
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.956587 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-798b7dc5fb-xl2zq"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.956615 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-798b7dc5fb-xl2zq"
Jan 30 05:26:22 crc kubenswrapper[4931]: I0130 05:26:22.981302 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-794bfbdd44-9msr6" podStartSLOduration=3.981283362 podStartE2EDuration="3.981283362s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:22.964599912 +0000 UTC m=+1118.334510169" watchObservedRunningTime="2026-01-30 05:26:22.981283362 +0000 UTC m=+1118.351193619"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.092983 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6554f656b5-c7z8q"]
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.102272 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6554f656b5-c7z8q"]
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.119760 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-798b7dc5fb-xl2zq" podStartSLOduration=4.119717143 podStartE2EDuration="4.119717143s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:23.066164045 +0000 UTC m=+1118.436074302" watchObservedRunningTime="2026-01-30 05:26:23.119717143 +0000 UTC m=+1118.489627400"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.136640 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" podStartSLOduration=4.136626949 podStartE2EDuration="4.136626949s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:23.09295982 +0000 UTC m=+1118.462870077" watchObservedRunningTime="2026-01-30 05:26:23.136626949 +0000 UTC m=+1118.506537206"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.212876 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-687c697484-j2btt"]
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.239271 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75d9f6f6ff-kmswn"]
Jan 30 05:26:23 crc kubenswrapper[4931]: E0130 05:26:23.239714 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" containerName="init"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.239731 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" containerName="init"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.239937 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" containerName="init"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.240888 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.244457 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.246187 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.249878 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75d9f6f6ff-kmswn"]
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.328357 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-public-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.328414 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-internal-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.328467 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-httpd-config\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.328823 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-ovndb-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.328889 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c78ch\" (UniqueName: \"kubernetes.io/projected/e1f9790c-c395-4c72-b569-3140f703b56f-kube-api-access-c78ch\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.328917 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-config\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.329074 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-combined-ca-bundle\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.431030 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-combined-ca-bundle\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.431137 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-public-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.431171 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-internal-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.431199 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-httpd-config\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.431242 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-ovndb-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.431266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c78ch\" (UniqueName: \"kubernetes.io/projected/e1f9790c-c395-4c72-b569-3140f703b56f-kube-api-access-c78ch\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.431280 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-config\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.442448 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d840d6e-c5f2-4b2e-9dc1-1b6df0950224" path="/var/lib/kubelet/pods/6d840d6e-c5f2-4b2e-9dc1-1b6df0950224/volumes"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.458225 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-ovndb-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.459547 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-config\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.460146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-internal-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.476383 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-combined-ca-bundle\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.476888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-public-tls-certs\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.477343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-httpd-config\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.479539 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c78ch\" (UniqueName: \"kubernetes.io/projected/e1f9790c-c395-4c72-b569-3140f703b56f-kube-api-access-c78ch\") pod \"neutron-75d9f6f6ff-kmswn\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:23 crc kubenswrapper[4931]: I0130 05:26:23.563019 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75d9f6f6ff-kmswn"
Jan 30 05:26:24 crc kubenswrapper[4931]: I0130 05:26:24.703037 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:24 crc kubenswrapper[4931]: I0130 05:26:24.934949 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 05:26:25 crc kubenswrapper[4931]: I0130 05:26:25.007354 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-687c697484-j2btt" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-api" containerID="cri-o://56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e" gracePeriod=30
Jan 30 05:26:25 crc kubenswrapper[4931]: I0130 05:26:25.007783 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-687c697484-j2btt" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-httpd" containerID="cri-o://14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625" gracePeriod=30
Jan 30 05:26:25 crc kubenswrapper[4931]: W0130 05:26:25.459336 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1f9790c_c395_4c72_b569_3140f703b56f.slice/crio-7ca85a404546ccf8741ecd606d0270826591549f7f7979a41ae78e99d2986a63 WatchSource:0}: Error finding container 7ca85a404546ccf8741ecd606d0270826591549f7f7979a41ae78e99d2986a63: Status 404 returned error can't find the container with id 7ca85a404546ccf8741ecd606d0270826591549f7f7979a41ae78e99d2986a63
Jan 30 05:26:25 crc kubenswrapper[4931]: I0130 05:26:25.464569 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75d9f6f6ff-kmswn"]
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.064474 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" event={"ID":"728a2e60-915e-4447-9465-aa64f7f5c7bb","Type":"ContainerStarted","Data":"1bd0c14353cbfd196f658cae7f7167624a1cc818a0ca23ec5151f1c871a22e65"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.064735 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" event={"ID":"728a2e60-915e-4447-9465-aa64f7f5c7bb","Type":"ContainerStarted","Data":"f0f483180c30bc672edc2e00c840d52567eb9b0c61f8c285d3a7c2a185f38020"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.098357 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" podStartSLOduration=3.397344378 podStartE2EDuration="7.098342226s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="2026-01-30 05:26:21.170584727 +0000 UTC m=+1116.540494984" lastFinishedPulling="2026-01-30 05:26:24.871582585 +0000 UTC m=+1120.241492832" observedRunningTime="2026-01-30 05:26:26.095858349 +0000 UTC m=+1121.465768626" watchObservedRunningTime="2026-01-30 05:26:26.098342226 +0000 UTC m=+1121.468252483"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.119659 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67465d5765-cp74w" event={"ID":"a83e0ea3-83ba-4e7c-803c-4fd9811318a2","Type":"ContainerStarted","Data":"e33db123fcf3634be4f056a5ccfd14e3aaf930151b4fce1b9c71c79348a5ff96"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.119703 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67465d5765-cp74w" event={"ID":"a83e0ea3-83ba-4e7c-803c-4fd9811318a2","Type":"ContainerStarted","Data":"fcd84ab06f79a15cca51ab919a2d8f4365b9ca38a548c71882f816752c64d1a8"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.147932 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-f7d589966-mkfs5"]
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.184053 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6665f9d796-74mbd"]
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.184290 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6665f9d796-74mbd" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api-log" containerID="cri-o://fc46b2c7ab19d04106c22b57ea741652e3c63fe169a85fea4405836701cbe7c6" gracePeriod=30
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.184584 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-67465d5765-cp74w" podStartSLOduration=3.866018903 podStartE2EDuration="8.184562741s" podCreationTimestamp="2026-01-30 05:26:18 +0000 UTC" firstStartedPulling="2026-01-30 05:26:20.554946866 +0000 UTC m=+1115.924857123" lastFinishedPulling="2026-01-30 05:26:24.873490704 +0000 UTC m=+1120.243400961" observedRunningTime="2026-01-30 05:26:26.166116947 +0000 UTC m=+1121.536027194" watchObservedRunningTime="2026-01-30 05:26:26.184562741 +0000 UTC m=+1121.554472998"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.186720 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6665f9d796-74mbd" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api" containerID="cri-o://e29747a73e8f9c7e78f65f5f5c5542788fec252768ae4f62a8f1f67f3d4ca4ed" gracePeriod=30
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.189751 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75d9f6f6ff-kmswn" event={"ID":"e1f9790c-c395-4c72-b569-3140f703b56f","Type":"ContainerStarted","Data":"59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.189783 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75d9f6f6ff-kmswn" event={"ID":"e1f9790c-c395-4c72-b569-3140f703b56f","Type":"ContainerStarted","Data":"7ca85a404546ccf8741ecd606d0270826591549f7f7979a41ae78e99d2986a63"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.198660 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6665f9d796-74mbd" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.219037 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" event={"ID":"807d8709-a403-4186-83f5-ec76aee793fe","Type":"ContainerStarted","Data":"4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.219077 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" event={"ID":"807d8709-a403-4186-83f5-ec76aee793fe","Type":"ContainerStarted","Data":"74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.220075 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d69b6c966-npv8t"]
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.227813 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.233152 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.233732 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.237296 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c996f77-c9rqm" event={"ID":"7729e2d8-6c8c-4759-9e5d-535ad1586f47","Type":"ContainerStarted","Data":"2c58ff417f0ff91cddebc47f633febd5ac50ac3ddd97dcc2fce28574c94ac8a6"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.237398 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c996f77-c9rqm" event={"ID":"7729e2d8-6c8c-4759-9e5d-535ad1586f47","Type":"ContainerStarted","Data":"4ec5d987a513f3f04cf30f8d242bd5ee734a2387c5a279b070c18b72f4a56519"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.247344 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d69b6c966-npv8t"]
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.255122 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" podStartSLOduration=3.938538331 podStartE2EDuration="8.255107998s" podCreationTimestamp="2026-01-30 05:26:18 +0000 UTC" firstStartedPulling="2026-01-30 05:26:20.554694698 +0000 UTC m=+1115.924604955" lastFinishedPulling="2026-01-30 05:26:24.871264365 +0000 UTC m=+1120.241174622" observedRunningTime="2026-01-30 05:26:26.254471748 +0000 UTC m=+1121.624382015" watchObservedRunningTime="2026-01-30 05:26:26.255107998 +0000 UTC m=+1121.625018255"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.261274 4931 generic.go:334] "Generic (PLEG): container finished" podID="84203bc9-afb4-42cb-843d-c211490ce275" containerID="14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625" exitCode=0
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.261315 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-687c697484-j2btt" event={"ID":"84203bc9-afb4-42cb-843d-c211490ce275","Type":"ContainerDied","Data":"14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625"}
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.314668 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-combined-ca-bundle\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.314730 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-public-tls-certs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.314752 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-internal-tls-certs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.314860 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99zt\" (UniqueName: \"kubernetes.io/projected/58928fea-709c-44d8-bd12-23937da8e2c4-kube-api-access-x99zt\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.314900 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data-custom\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.314935 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58928fea-709c-44d8-bd12-23937da8e2c4-logs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.314966 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.317088 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c996f77-c9rqm" podStartSLOduration=3.973442108 podStartE2EDuration="7.317069787s" podCreationTimestamp="2026-01-30 05:26:19 +0000 UTC" firstStartedPulling="2026-01-30 05:26:21.528614896 +0000 UTC m=+1116.898525153" lastFinishedPulling="2026-01-30 05:26:24.872242575 +0000 UTC m=+1120.242152832" observedRunningTime="2026-01-30 05:26:26.296395973 +0000 UTC m=+1121.666306230" watchObservedRunningTime="2026-01-30 05:26:26.317069787 +0000 UTC m=+1121.686980044"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.353857 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-67465d5765-cp74w"]
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.417552 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99zt\" (UniqueName: \"kubernetes.io/projected/58928fea-709c-44d8-bd12-23937da8e2c4-kube-api-access-x99zt\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.417607 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data-custom\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.417642 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58928fea-709c-44d8-bd12-23937da8e2c4-logs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.417674 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.417715 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-combined-ca-bundle\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.417742 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-public-tls-certs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.417760 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-internal-tls-certs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.418279 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58928fea-709c-44d8-bd12-23937da8e2c4-logs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.424160 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-internal-tls-certs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t"
Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.425156 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-combined-ca-bundle\") pod \"barbican-api-7d69b6c966-npv8t\" (UID:
\"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.437284 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.439945 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-public-tls-certs\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.440195 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data-custom\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.449456 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99zt\" (UniqueName: \"kubernetes.io/projected/58928fea-709c-44d8-bd12-23937da8e2c4-kube-api-access-x99zt\") pod \"barbican-api-7d69b6c966-npv8t\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:26 crc kubenswrapper[4931]: I0130 05:26:26.557215 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.055710 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d69b6c966-npv8t"] Jan 30 05:26:27 crc kubenswrapper[4931]: W0130 05:26:27.067700 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58928fea_709c_44d8_bd12_23937da8e2c4.slice/crio-80b7562d2e28920f91efc6005a3dada9547915b81544850dc4480d2be479f27a WatchSource:0}: Error finding container 80b7562d2e28920f91efc6005a3dada9547915b81544850dc4480d2be479f27a: Status 404 returned error can't find the container with id 80b7562d2e28920f91efc6005a3dada9547915b81544850dc4480d2be479f27a Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.273559 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d69b6c966-npv8t" event={"ID":"58928fea-709c-44d8-bd12-23937da8e2c4","Type":"ContainerStarted","Data":"80b7562d2e28920f91efc6005a3dada9547915b81544850dc4480d2be479f27a"} Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.276976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75d9f6f6ff-kmswn" event={"ID":"e1f9790c-c395-4c72-b569-3140f703b56f","Type":"ContainerStarted","Data":"e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e"} Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.277865 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75d9f6f6ff-kmswn" Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.280143 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpr97" event={"ID":"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6","Type":"ContainerStarted","Data":"1e41b0f0000283bd1a29c28f4d8fdb74fdd5389e3ebd8804eac1db1375b10248"} Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.282386 4931 generic.go:334] "Generic (PLEG): container 
finished" podID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerID="fc46b2c7ab19d04106c22b57ea741652e3c63fe169a85fea4405836701cbe7c6" exitCode=143 Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.282960 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6665f9d796-74mbd" event={"ID":"28e3fd91-5906-4368-b156-e0d60f3c268e","Type":"ContainerDied","Data":"fc46b2c7ab19d04106c22b57ea741652e3c63fe169a85fea4405836701cbe7c6"} Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.284031 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener-log" containerID="cri-o://74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6" gracePeriod=30 Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.284263 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener" containerID="cri-o://4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0" gracePeriod=30 Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.304789 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75d9f6f6ff-kmswn" podStartSLOduration=4.304774473 podStartE2EDuration="4.304774473s" podCreationTimestamp="2026-01-30 05:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:27.300618984 +0000 UTC m=+1122.670529241" watchObservedRunningTime="2026-01-30 05:26:27.304774473 +0000 UTC m=+1122.674684730" Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.334870 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rpr97" 
podStartSLOduration=3.844829641 podStartE2EDuration="40.33485332s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="2026-01-30 05:25:48.742369501 +0000 UTC m=+1084.112279758" lastFinishedPulling="2026-01-30 05:26:25.23239318 +0000 UTC m=+1120.602303437" observedRunningTime="2026-01-30 05:26:27.334384805 +0000 UTC m=+1122.704295072" watchObservedRunningTime="2026-01-30 05:26:27.33485332 +0000 UTC m=+1122.704763577" Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.366870 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:26:27 crc kubenswrapper[4931]: I0130 05:26:27.366920 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.097402 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.295322 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d69b6c966-npv8t" event={"ID":"58928fea-709c-44d8-bd12-23937da8e2c4","Type":"ContainerStarted","Data":"44392d9ac535d9a3ce2ca47aa88e680823c3197a2e50d537aa67df4b03e52fd1"} Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.295373 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d69b6c966-npv8t" 
event={"ID":"58928fea-709c-44d8-bd12-23937da8e2c4","Type":"ContainerStarted","Data":"0e0a199cc977b5213010336cc2b6c461a3916b61b6c3d9f6dc8eecc7d5c8d17e"} Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.295412 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.295593 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.306352 4931 generic.go:334] "Generic (PLEG): container finished" podID="807d8709-a403-4186-83f5-ec76aee793fe" containerID="74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6" exitCode=143 Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.306686 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" event={"ID":"807d8709-a403-4186-83f5-ec76aee793fe","Type":"ContainerDied","Data":"74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6"} Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.306923 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-67465d5765-cp74w" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker-log" containerID="cri-o://fcd84ab06f79a15cca51ab919a2d8f4365b9ca38a548c71882f816752c64d1a8" gracePeriod=30 Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.307028 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-67465d5765-cp74w" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker" containerID="cri-o://e33db123fcf3634be4f056a5ccfd14e3aaf930151b4fce1b9c71c79348a5ff96" gracePeriod=30 Jan 30 05:26:28 crc kubenswrapper[4931]: I0130 05:26:28.331030 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-7d69b6c966-npv8t" podStartSLOduration=2.3310119 podStartE2EDuration="2.3310119s" podCreationTimestamp="2026-01-30 05:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:28.320303457 +0000 UTC m=+1123.690213714" watchObservedRunningTime="2026-01-30 05:26:28.3310119 +0000 UTC m=+1123.700922157" Jan 30 05:26:29 crc kubenswrapper[4931]: I0130 05:26:29.324611 4931 generic.go:334] "Generic (PLEG): container finished" podID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerID="e33db123fcf3634be4f056a5ccfd14e3aaf930151b4fce1b9c71c79348a5ff96" exitCode=0 Jan 30 05:26:29 crc kubenswrapper[4931]: I0130 05:26:29.325024 4931 generic.go:334] "Generic (PLEG): container finished" podID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerID="fcd84ab06f79a15cca51ab919a2d8f4365b9ca38a548c71882f816752c64d1a8" exitCode=143 Jan 30 05:26:29 crc kubenswrapper[4931]: I0130 05:26:29.324682 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67465d5765-cp74w" event={"ID":"a83e0ea3-83ba-4e7c-803c-4fd9811318a2","Type":"ContainerDied","Data":"e33db123fcf3634be4f056a5ccfd14e3aaf930151b4fce1b9c71c79348a5ff96"} Jan 30 05:26:29 crc kubenswrapper[4931]: I0130 05:26:29.325328 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67465d5765-cp74w" event={"ID":"a83e0ea3-83ba-4e7c-803c-4fd9811318a2","Type":"ContainerDied","Data":"fcd84ab06f79a15cca51ab919a2d8f4365b9ca38a548c71882f816752c64d1a8"} Jan 30 05:26:29 crc kubenswrapper[4931]: I0130 05:26:29.907292 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:29 crc kubenswrapper[4931]: I0130 05:26:29.972889 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"] Jan 30 05:26:29 crc kubenswrapper[4931]: I0130 05:26:29.973116 4931 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" podUID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerName="dnsmasq-dns" containerID="cri-o://00f6a2dd44878296bf4733164be83ce28b802aa8c1f8943860365c668511c527" gracePeriod=10 Jan 30 05:26:30 crc kubenswrapper[4931]: I0130 05:26:30.340692 4931 generic.go:334] "Generic (PLEG): container finished" podID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerID="00f6a2dd44878296bf4733164be83ce28b802aa8c1f8943860365c668511c527" exitCode=0 Jan 30 05:26:30 crc kubenswrapper[4931]: I0130 05:26:30.341239 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" event={"ID":"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089","Type":"ContainerDied","Data":"00f6a2dd44878296bf4733164be83ce28b802aa8c1f8943860365c668511c527"} Jan 30 05:26:31 crc kubenswrapper[4931]: I0130 05:26:31.588774 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6665f9d796-74mbd" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:41292->10.217.0.157:9311: read: connection reset by peer" Jan 30 05:26:31 crc kubenswrapper[4931]: I0130 05:26:31.588811 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6665f9d796-74mbd" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:41288->10.217.0.157:9311: read: connection reset by peer" Jan 30 05:26:31 crc kubenswrapper[4931]: I0130 05:26:31.779120 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:31 crc kubenswrapper[4931]: I0130 05:26:31.846206 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-794bfbdd44-9msr6" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.387774 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" event={"ID":"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089","Type":"ContainerDied","Data":"ae9e2c438382358c90343f1970628f2e8ea67a2dacc48b9e3c93a331cd67467e"} Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.387822 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae9e2c438382358c90343f1970628f2e8ea67a2dacc48b9e3c93a331cd67467e" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.395711 4931 generic.go:334] "Generic (PLEG): container finished" podID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerID="e29747a73e8f9c7e78f65f5f5c5542788fec252768ae4f62a8f1f67f3d4ca4ed" exitCode=0 Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.395792 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6665f9d796-74mbd" event={"ID":"28e3fd91-5906-4368-b156-e0d60f3c268e","Type":"ContainerDied","Data":"e29747a73e8f9c7e78f65f5f5c5542788fec252768ae4f62a8f1f67f3d4ca4ed"} Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.447487 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.569864 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5mcg\" (UniqueName: \"kubernetes.io/projected/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-kube-api-access-s5mcg\") pod \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.569934 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-config\") pod \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.569954 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-sb\") pod \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.569994 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-swift-storage-0\") pod \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.570017 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-nb\") pod \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.570035 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-svc\") pod \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\" (UID: \"10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089\") " Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.593411 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-kube-api-access-s5mcg" (OuterVolumeSpecName: "kube-api-access-s5mcg") pod "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" (UID: "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089"). InnerVolumeSpecName "kube-api-access-s5mcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.641568 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" (UID: "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.663675 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-config" (OuterVolumeSpecName: "config") pod "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" (UID: "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.671949 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.671972 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.671983 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5mcg\" (UniqueName: \"kubernetes.io/projected/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-kube-api-access-s5mcg\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.689002 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" (UID: "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.751082 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" (UID: "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.758126 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" (UID: "10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.774004 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.774032 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4931]: I0130 05:26:32.774050 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.058231 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.082179 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.184592 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data\") pod \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.184636 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-combined-ca-bundle\") pod \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.184678 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vl6j\" (UniqueName: \"kubernetes.io/projected/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-kube-api-access-8vl6j\") pod \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.184718 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data\") pod \"28e3fd91-5906-4368-b156-e0d60f3c268e\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.184740 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-logs\") pod \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.185216 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/28e3fd91-5906-4368-b156-e0d60f3c268e-logs\") pod \"28e3fd91-5906-4368-b156-e0d60f3c268e\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.185325 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9lwj\" (UniqueName: \"kubernetes.io/projected/28e3fd91-5906-4368-b156-e0d60f3c268e-kube-api-access-f9lwj\") pod \"28e3fd91-5906-4368-b156-e0d60f3c268e\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.185349 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data-custom\") pod \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\" (UID: \"a83e0ea3-83ba-4e7c-803c-4fd9811318a2\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.185373 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data-custom\") pod \"28e3fd91-5906-4368-b156-e0d60f3c268e\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.185397 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-combined-ca-bundle\") pod \"28e3fd91-5906-4368-b156-e0d60f3c268e\" (UID: \"28e3fd91-5906-4368-b156-e0d60f3c268e\") " Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.185899 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-logs" (OuterVolumeSpecName: "logs") pod "a83e0ea3-83ba-4e7c-803c-4fd9811318a2" (UID: "a83e0ea3-83ba-4e7c-803c-4fd9811318a2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.186039 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e3fd91-5906-4368-b156-e0d60f3c268e-logs" (OuterVolumeSpecName: "logs") pod "28e3fd91-5906-4368-b156-e0d60f3c268e" (UID: "28e3fd91-5906-4368-b156-e0d60f3c268e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.189143 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28e3fd91-5906-4368-b156-e0d60f3c268e" (UID: "28e3fd91-5906-4368-b156-e0d60f3c268e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.196843 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a83e0ea3-83ba-4e7c-803c-4fd9811318a2" (UID: "a83e0ea3-83ba-4e7c-803c-4fd9811318a2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.196864 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-kube-api-access-8vl6j" (OuterVolumeSpecName: "kube-api-access-8vl6j") pod "a83e0ea3-83ba-4e7c-803c-4fd9811318a2" (UID: "a83e0ea3-83ba-4e7c-803c-4fd9811318a2"). InnerVolumeSpecName "kube-api-access-8vl6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.200711 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e3fd91-5906-4368-b156-e0d60f3c268e-kube-api-access-f9lwj" (OuterVolumeSpecName: "kube-api-access-f9lwj") pod "28e3fd91-5906-4368-b156-e0d60f3c268e" (UID: "28e3fd91-5906-4368-b156-e0d60f3c268e"). InnerVolumeSpecName "kube-api-access-f9lwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.210641 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a83e0ea3-83ba-4e7c-803c-4fd9811318a2" (UID: "a83e0ea3-83ba-4e7c-803c-4fd9811318a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.214124 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28e3fd91-5906-4368-b156-e0d60f3c268e" (UID: "28e3fd91-5906-4368-b156-e0d60f3c268e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.233909 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data" (OuterVolumeSpecName: "config-data") pod "a83e0ea3-83ba-4e7c-803c-4fd9811318a2" (UID: "a83e0ea3-83ba-4e7c-803c-4fd9811318a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.234160 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data" (OuterVolumeSpecName: "config-data") pod "28e3fd91-5906-4368-b156-e0d60f3c268e" (UID: "28e3fd91-5906-4368-b156-e0d60f3c268e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287674 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e3fd91-5906-4368-b156-e0d60f3c268e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287706 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9lwj\" (UniqueName: \"kubernetes.io/projected/28e3fd91-5906-4368-b156-e0d60f3c268e-kube-api-access-f9lwj\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287720 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287730 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287738 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287746 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287754 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287762 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vl6j\" (UniqueName: \"kubernetes.io/projected/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-kube-api-access-8vl6j\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287770 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e3fd91-5906-4368-b156-e0d60f3c268e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.287778 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a83e0ea3-83ba-4e7c-803c-4fd9811318a2-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:33 crc kubenswrapper[4931]: E0130 05:26:33.313671 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.406442 4931 generic.go:334] "Generic (PLEG): container finished" podID="6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" containerID="1e41b0f0000283bd1a29c28f4d8fdb74fdd5389e3ebd8804eac1db1375b10248" exitCode=0 Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.406510 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpr97" 
event={"ID":"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6","Type":"ContainerDied","Data":"1e41b0f0000283bd1a29c28f4d8fdb74fdd5389e3ebd8804eac1db1375b10248"} Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.410219 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerStarted","Data":"21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031"} Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.410277 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="ceilometer-notification-agent" containerID="cri-o://ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418" gracePeriod=30 Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.410307 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.410347 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="proxy-httpd" containerID="cri-o://21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031" gracePeriod=30 Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.410447 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="sg-core" containerID="cri-o://c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547" gracePeriod=30 Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.415137 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6665f9d796-74mbd" event={"ID":"28e3fd91-5906-4368-b156-e0d60f3c268e","Type":"ContainerDied","Data":"b479366de18a258ddf192480628e9708d18845ae58ecac6569bfbb633a96f682"} Jan 30 05:26:33 crc 
kubenswrapper[4931]: I0130 05:26:33.415165 4931 scope.go:117] "RemoveContainer" containerID="e29747a73e8f9c7e78f65f5f5c5542788fec252768ae4f62a8f1f67f3d4ca4ed" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.415242 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6665f9d796-74mbd" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.420631 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-6rw7f" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.421248 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67465d5765-cp74w" event={"ID":"a83e0ea3-83ba-4e7c-803c-4fd9811318a2","Type":"ContainerDied","Data":"9f1458d6f86849c7d56580c53cae53507cdf0fec4d72928952c134f8ba2a7ca8"} Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.421326 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67465d5765-cp74w" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.475446 4931 scope.go:117] "RemoveContainer" containerID="fc46b2c7ab19d04106c22b57ea741652e3c63fe169a85fea4405836701cbe7c6" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.505162 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6665f9d796-74mbd"] Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.505821 4931 scope.go:117] "RemoveContainer" containerID="e33db123fcf3634be4f056a5ccfd14e3aaf930151b4fce1b9c71c79348a5ff96" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.543254 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6665f9d796-74mbd"] Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.555965 4931 scope.go:117] "RemoveContainer" containerID="fcd84ab06f79a15cca51ab919a2d8f4365b9ca38a548c71882f816752c64d1a8" Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.559019 4931 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-67465d5765-cp74w"] Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.570293 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-67465d5765-cp74w"] Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.604087 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"] Jan 30 05:26:33 crc kubenswrapper[4931]: I0130 05:26:33.621699 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-6rw7f"] Jan 30 05:26:33 crc kubenswrapper[4931]: E0130 05:26:33.671936 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddebbaca0_0d1f_47cd_bb8e_8e09e4a65307.slice/crio-c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda83e0ea3_83ba_4e7c_803c_4fd9811318a2.slice/crio-9f1458d6f86849c7d56580c53cae53507cdf0fec4d72928952c134f8ba2a7ca8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda83e0ea3_83ba_4e7c_803c_4fd9811318a2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e3fd91_5906_4368_b156_e0d60f3c268e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddebbaca0_0d1f_47cd_bb8e_8e09e4a65307.slice/crio-conmon-c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e3fd91_5906_4368_b156_e0d60f3c268e.slice/crio-b479366de18a258ddf192480628e9708d18845ae58ecac6569bfbb633a96f682\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10aa7dbb_a9c9_4f2e_8ae5_ec39da4fb089.slice/crio-ae9e2c438382358c90343f1970628f2e8ea67a2dacc48b9e3c93a331cd67467e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10aa7dbb_a9c9_4f2e_8ae5_ec39da4fb089.slice\": RecentStats: unable to find data in memory cache]" Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.434655 4931 generic.go:334] "Generic (PLEG): container finished" podID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerID="21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031" exitCode=0 Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.435099 4931 generic.go:334] "Generic (PLEG): container finished" podID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerID="c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547" exitCode=2 Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.435192 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerDied","Data":"21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031"} Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.435243 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerDied","Data":"c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547"} Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.887059 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rpr97" Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.918802 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-combined-ca-bundle\") pod \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.918983 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-config-data\") pod \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.919027 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-db-sync-config-data\") pod \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.919071 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6jwz\" (UniqueName: \"kubernetes.io/projected/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-kube-api-access-x6jwz\") pod \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.919187 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-scripts\") pod \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.919318 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-etc-machine-id\") pod \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\" (UID: \"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6\") " Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.920084 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" (UID: "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.927714 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-scripts" (OuterVolumeSpecName: "scripts") pod "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" (UID: "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.929276 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" (UID: "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.937659 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-kube-api-access-x6jwz" (OuterVolumeSpecName: "kube-api-access-x6jwz") pod "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" (UID: "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"). InnerVolumeSpecName "kube-api-access-x6jwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:34 crc kubenswrapper[4931]: I0130 05:26:34.971229 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" (UID: "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.001046 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-config-data" (OuterVolumeSpecName: "config-data") pod "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" (UID: "6dd6723b-baf8-47eb-a774-68a5dfbcc4a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.021578 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.021605 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.021614 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.021622 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:35 
crc kubenswrapper[4931]: I0130 05:26:35.021631 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.021639 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6jwz\" (UniqueName: \"kubernetes.io/projected/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6-kube-api-access-x6jwz\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.439558 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" path="/var/lib/kubelet/pods/10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089/volumes" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.440847 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" path="/var/lib/kubelet/pods/28e3fd91-5906-4368-b156-e0d60f3c268e/volumes" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.442359 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" path="/var/lib/kubelet/pods/a83e0ea3-83ba-4e7c-803c-4fd9811318a2/volumes" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.451521 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpr97" event={"ID":"6dd6723b-baf8-47eb-a774-68a5dfbcc4a6","Type":"ContainerDied","Data":"f838ce1f11e506679d678bae95342cc3dcecec78b2114b17644603c407ad3619"} Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.451594 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f838ce1f11e506679d678bae95342cc3dcecec78b2114b17644603c407ad3619" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.451689 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rpr97" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832048 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-vxxmk"] Jan 30 05:26:35 crc kubenswrapper[4931]: E0130 05:26:35.832391 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832408 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api" Jan 30 05:26:35 crc kubenswrapper[4931]: E0130 05:26:35.832444 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerName="init" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832451 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerName="init" Jan 30 05:26:35 crc kubenswrapper[4931]: E0130 05:26:35.832464 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" containerName="cinder-db-sync" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832474 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" containerName="cinder-db-sync" Jan 30 05:26:35 crc kubenswrapper[4931]: E0130 05:26:35.832483 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker-log" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832490 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker-log" Jan 30 05:26:35 crc kubenswrapper[4931]: E0130 05:26:35.832504 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api-log" Jan 
30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832510 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api-log" Jan 30 05:26:35 crc kubenswrapper[4931]: E0130 05:26:35.832522 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832529 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker" Jan 30 05:26:35 crc kubenswrapper[4931]: E0130 05:26:35.832539 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerName="dnsmasq-dns" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832546 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerName="dnsmasq-dns" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832707 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api-log" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832723 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832735 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="10aa7dbb-a9c9-4f2e-8ae5-ec39da4fb089" containerName="dnsmasq-dns" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832749 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" containerName="cinder-db-sync" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.832757 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e3fd91-5906-4368-b156-e0d60f3c268e" containerName="barbican-api" Jan 30 05:26:35 
crc kubenswrapper[4931]: I0130 05:26:35.832764 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83e0ea3-83ba-4e7c-803c-4fd9811318a2" containerName="barbican-worker-log" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.833625 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.853963 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-vxxmk"] Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.867221 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.872647 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.876455 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kv6bp" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.876560 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.876671 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.876748 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.888035 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.943864 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-config\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: 
\"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.943912 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682nm\" (UniqueName: \"kubernetes.io/projected/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-kube-api-access-682nm\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.943945 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.943983 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-scripts\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944001 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944040 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: 
\"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944058 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944075 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944098 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944117 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkqp\" (UniqueName: \"kubernetes.io/projected/1f487872-4003-4559-8f72-1c6022321160-kube-api-access-2pkqp\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944145 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: 
\"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.944200 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f487872-4003-4559-8f72-1c6022321160-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.995597 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:35 crc kubenswrapper[4931]: I0130 05:26:35.997476 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.000016 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.010184 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.048851 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-config\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.048891 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-682nm\" (UniqueName: \"kubernetes.io/projected/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-kube-api-access-682nm\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.048928 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.048968 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-scripts\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.048989 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data-custom\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049048 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049066 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049084 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049102 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049124 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049146 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkqp\" (UniqueName: \"kubernetes.io/projected/1f487872-4003-4559-8f72-1c6022321160-kube-api-access-2pkqp\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049177 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-nb\") pod 
\"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049218 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m9bv\" (UniqueName: \"kubernetes.io/projected/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-kube-api-access-5m9bv\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049238 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-scripts\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049264 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049283 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.049600 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-logs\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 
crc kubenswrapper[4931]: I0130 05:26:36.049950 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.050012 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f487872-4003-4559-8f72-1c6022321160-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.050089 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f487872-4003-4559-8f72-1c6022321160-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.050500 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.051019 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.051778 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-config\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.055946 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.056635 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.057553 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.059946 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.061880 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-scripts\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " 
pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.077735 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkqp\" (UniqueName: \"kubernetes.io/projected/1f487872-4003-4559-8f72-1c6022321160-kube-api-access-2pkqp\") pod \"cinder-scheduler-0\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.082864 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-682nm\" (UniqueName: \"kubernetes.io/projected/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-kube-api-access-682nm\") pod \"dnsmasq-dns-75bfc9b94f-vxxmk\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.154652 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data-custom\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.154739 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.154793 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m9bv\" (UniqueName: \"kubernetes.io/projected/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-kube-api-access-5m9bv\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.154813 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-scripts\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.154837 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.154856 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.154886 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-logs\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.155482 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-logs\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.156869 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc 
kubenswrapper[4931]: I0130 05:26:36.165696 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.166308 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.166960 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.173006 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-scripts\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.182927 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m9bv\" (UniqueName: \"kubernetes.io/projected/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-kube-api-access-5m9bv\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.200115 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.207201 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data-custom\") pod \"cinder-api-0\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.344443 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.608703 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.731173 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-vxxmk"] Jan 30 05:26:36 crc kubenswrapper[4931]: W0130 05:26:36.988706 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf6bc53b_31f7_4650_aab3_d4bcf8b685ab.slice/crio-9ea7e55f3db83940154f2b0bcb0d4ef000b7a1cdefa0062385e86b6b76cab2c2 WatchSource:0}: Error finding container 9ea7e55f3db83940154f2b0bcb0d4ef000b7a1cdefa0062385e86b6b76cab2c2: Status 404 returned error can't find the container with id 9ea7e55f3db83940154f2b0bcb0d4ef000b7a1cdefa0062385e86b6b76cab2c2 Jan 30 05:26:36 crc kubenswrapper[4931]: I0130 05:26:36.991215 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.474683 4931 generic.go:334] "Generic (PLEG): container finished" podID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerID="096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd" exitCode=0 Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.475346 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" event={"ID":"98fe74d3-fa52-4814-8497-1a9bb9ea72ed","Type":"ContainerDied","Data":"096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd"} Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.475382 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" event={"ID":"98fe74d3-fa52-4814-8497-1a9bb9ea72ed","Type":"ContainerStarted","Data":"e1641f306bf07b5142c6dd94dd4d7be821af4a934007d916dd0dd69749c5f578"} Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.476628 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"af6bc53b-31f7-4650-aab3-d4bcf8b685ab","Type":"ContainerStarted","Data":"9ea7e55f3db83940154f2b0bcb0d4ef000b7a1cdefa0062385e86b6b76cab2c2"} Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.479984 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1f487872-4003-4559-8f72-1c6022321160","Type":"ContainerStarted","Data":"16910d816a28eef85f00dcbaeae0524c9893ffdb8537d1cf664654c8a4d009f8"} Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.844142 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.906753 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.987815 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-log-httpd\") pod \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.987860 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl9x6\" (UniqueName: \"kubernetes.io/projected/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-kube-api-access-nl9x6\") pod \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.988030 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-combined-ca-bundle\") pod \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.988072 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-sg-core-conf-yaml\") pod \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.988143 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-scripts\") pod \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.988182 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-config-data\") pod \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.988239 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-run-httpd\") pod \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\" (UID: \"debbaca0-0d1f-47cd-bb8e-8e09e4a65307\") " Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.989157 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "debbaca0-0d1f-47cd-bb8e-8e09e4a65307" (UID: "debbaca0-0d1f-47cd-bb8e-8e09e4a65307"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.989873 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "debbaca0-0d1f-47cd-bb8e-8e09e4a65307" (UID: "debbaca0-0d1f-47cd-bb8e-8e09e4a65307"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.994698 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-kube-api-access-nl9x6" (OuterVolumeSpecName: "kube-api-access-nl9x6") pod "debbaca0-0d1f-47cd-bb8e-8e09e4a65307" (UID: "debbaca0-0d1f-47cd-bb8e-8e09e4a65307"). InnerVolumeSpecName "kube-api-access-nl9x6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:37 crc kubenswrapper[4931]: I0130 05:26:37.995084 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-scripts" (OuterVolumeSpecName: "scripts") pod "debbaca0-0d1f-47cd-bb8e-8e09e4a65307" (UID: "debbaca0-0d1f-47cd-bb8e-8e09e4a65307"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.031665 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "debbaca0-0d1f-47cd-bb8e-8e09e4a65307" (UID: "debbaca0-0d1f-47cd-bb8e-8e09e4a65307"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.076084 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-config-data" (OuterVolumeSpecName: "config-data") pod "debbaca0-0d1f-47cd-bb8e-8e09e4a65307" (UID: "debbaca0-0d1f-47cd-bb8e-8e09e4a65307"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.077231 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "debbaca0-0d1f-47cd-bb8e-8e09e4a65307" (UID: "debbaca0-0d1f-47cd-bb8e-8e09e4a65307"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.093583 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.093614 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.093626 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.093635 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.093646 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.093655 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.093663 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl9x6\" (UniqueName: \"kubernetes.io/projected/debbaca0-0d1f-47cd-bb8e-8e09e4a65307-kube-api-access-nl9x6\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.421709 4931 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.453133 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.500126 4931 generic.go:334] "Generic (PLEG): container finished" podID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerID="ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418" exitCode=0 Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.500190 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerDied","Data":"ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418"} Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.500216 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"debbaca0-0d1f-47cd-bb8e-8e09e4a65307","Type":"ContainerDied","Data":"c0144289ab3513de686db41a01bd60e595d46a3f8bcaea66b48e5c2753f90feb"} Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.500234 4931 scope.go:117] "RemoveContainer" containerID="21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.500355 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.518918 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" event={"ID":"98fe74d3-fa52-4814-8497-1a9bb9ea72ed","Type":"ContainerStarted","Data":"1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f"} Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.518990 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.526617 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-794bfbdd44-9msr6"] Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.526896 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-794bfbdd44-9msr6" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api-log" containerID="cri-o://845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5" gracePeriod=30 Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.526929 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-794bfbdd44-9msr6" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api" containerID="cri-o://7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819" gracePeriod=30 Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.532699 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api-log" containerID="cri-o://2a8b28eadcd454ca8adf3b36ea9153ce1d0f727ce7e3b65bd14a0471ebbea32f" gracePeriod=30 Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.532777 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"af6bc53b-31f7-4650-aab3-d4bcf8b685ab","Type":"ContainerStarted","Data":"6b99333c2447cf347ded741a740d78480c65d8cae2c155c1e36dafd5c5578db3"} Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.532800 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"af6bc53b-31f7-4650-aab3-d4bcf8b685ab","Type":"ContainerStarted","Data":"2a8b28eadcd454ca8adf3b36ea9153ce1d0f727ce7e3b65bd14a0471ebbea32f"} Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.532818 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.532841 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api" containerID="cri-o://6b99333c2447cf347ded741a740d78480c65d8cae2c155c1e36dafd5c5578db3" gracePeriod=30 Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.544711 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1f487872-4003-4559-8f72-1c6022321160","Type":"ContainerStarted","Data":"1ee4814b304dc2facdec4fe5a7ec548e21648d29d6f5eb9a0d58da2eecb4e24b"} Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.559656 4931 scope.go:117] "RemoveContainer" containerID="c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547" Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.565914 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" podStartSLOduration=3.565893691 podStartE2EDuration="3.565893691s" podCreationTimestamp="2026-01-30 05:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:38.547280601 +0000 UTC m=+1133.917190878" watchObservedRunningTime="2026-01-30 05:26:38.565893691 +0000 UTC 
m=+1133.935803948"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.615070 4931 scope.go:117] "RemoveContainer" containerID="ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.620126 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.620107549 podStartE2EDuration="3.620107549s" podCreationTimestamp="2026-01-30 05:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:38.580864787 +0000 UTC m=+1133.950775044" watchObservedRunningTime="2026-01-30 05:26:38.620107549 +0000 UTC m=+1133.990017806"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.652504 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.671615 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.718414 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:26:38 crc kubenswrapper[4931]: E0130 05:26:38.725166 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="proxy-httpd"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.725195 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="proxy-httpd"
Jan 30 05:26:38 crc kubenswrapper[4931]: E0130 05:26:38.725219 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="sg-core"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.725226 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="sg-core"
Jan 30 05:26:38 crc kubenswrapper[4931]: E0130 05:26:38.725258 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="ceilometer-notification-agent"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.725265 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="ceilometer-notification-agent"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.725701 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="sg-core"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.725725 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="ceilometer-notification-agent"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.725749 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" containerName="proxy-httpd"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.730661 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.730788 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.732700 4931 scope.go:117] "RemoveContainer" containerID="21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031"
Jan 30 05:26:38 crc kubenswrapper[4931]: E0130 05:26:38.737089 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031\": container with ID starting with 21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031 not found: ID does not exist" containerID="21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.737133 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031"} err="failed to get container status \"21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031\": rpc error: code = NotFound desc = could not find container \"21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031\": container with ID starting with 21b691b2e09482e45017a7d0d0da5ee40918b81eaf090f346eedfe083ceae031 not found: ID does not exist"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.737159 4931 scope.go:117] "RemoveContainer" containerID="c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.738922 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.739049 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 05:26:38 crc kubenswrapper[4931]: E0130 05:26:38.739496 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547\": container with ID starting with c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547 not found: ID does not exist" containerID="c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.739573 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547"} err="failed to get container status \"c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547\": rpc error: code = NotFound desc = could not find container \"c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547\": container with ID starting with c463c463e45ea10f9d2515585c1aaf9faf895d02d8c1374d44f662c63e4c8547 not found: ID does not exist"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.739624 4931 scope.go:117] "RemoveContainer" containerID="ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418"
Jan 30 05:26:38 crc kubenswrapper[4931]: E0130 05:26:38.745087 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418\": container with ID starting with ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418 not found: ID does not exist" containerID="ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.745120 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418"} err="failed to get container status \"ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418\": rpc error: code = NotFound desc = could not find container \"ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418\": container with ID starting with ad927df37265a1a2b98e1c5c50856998727c6049b8e51b41498d7e63a4b86418 not found: ID does not exist"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.810873 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-config-data\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.811135 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-run-httpd\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.811180 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-scripts\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.811195 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.811241 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-log-httpd\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.811262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.811297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2j42\" (UniqueName: \"kubernetes.io/projected/e5755369-fc75-443e-b608-996b7212ac94-kube-api-access-x2j42\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.913467 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-config-data\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.913537 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-run-httpd\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.913581 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-scripts\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.913597 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.913652 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-log-httpd\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.913681 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.913727 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2j42\" (UniqueName: \"kubernetes.io/projected/e5755369-fc75-443e-b608-996b7212ac94-kube-api-access-x2j42\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.915650 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-run-httpd\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.917978 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-log-httpd\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.921232 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-scripts\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.922389 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-config-data\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.925826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.929271 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:38 crc kubenswrapper[4931]: I0130 05:26:38.934634 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2j42\" (UniqueName: \"kubernetes.io/projected/e5755369-fc75-443e-b608-996b7212ac94-kube-api-access-x2j42\") pod \"ceilometer-0\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") " pod="openstack/ceilometer-0"
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.063707 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.430948 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debbaca0-0d1f-47cd-bb8e-8e09e4a65307" path="/var/lib/kubelet/pods/debbaca0-0d1f-47cd-bb8e-8e09e4a65307/volumes"
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.532163 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:26:39 crc kubenswrapper[4931]: W0130 05:26:39.538863 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice/crio-ea5ec3936bd62e44ba566c0aec793fe2ef89fc6023d7cad6b0242228b7b8d07a WatchSource:0}: Error finding container ea5ec3936bd62e44ba566c0aec793fe2ef89fc6023d7cad6b0242228b7b8d07a: Status 404 returned error can't find the container with id ea5ec3936bd62e44ba566c0aec793fe2ef89fc6023d7cad6b0242228b7b8d07a
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.556598 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerStarted","Data":"ea5ec3936bd62e44ba566c0aec793fe2ef89fc6023d7cad6b0242228b7b8d07a"}
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.564369 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerID="845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5" exitCode=143
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.564454 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794bfbdd44-9msr6" event={"ID":"c3dfec36-0758-42c6-8c28-997044eb59a3","Type":"ContainerDied","Data":"845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5"}
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.568802 4931 generic.go:334] "Generic (PLEG): container finished" podID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerID="2a8b28eadcd454ca8adf3b36ea9153ce1d0f727ce7e3b65bd14a0471ebbea32f" exitCode=143
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.568878 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"af6bc53b-31f7-4650-aab3-d4bcf8b685ab","Type":"ContainerDied","Data":"2a8b28eadcd454ca8adf3b36ea9153ce1d0f727ce7e3b65bd14a0471ebbea32f"}
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.574779 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1f487872-4003-4559-8f72-1c6022321160","Type":"ContainerStarted","Data":"136922e5f994f3ec703ee0b76647539238ec38bef505c93a23f26dc8f73ef24d"}
Jan 30 05:26:39 crc kubenswrapper[4931]: I0130 05:26:39.593678 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.72480247 podStartE2EDuration="4.593663766s" podCreationTimestamp="2026-01-30 05:26:35 +0000 UTC" firstStartedPulling="2026-01-30 05:26:36.64675765 +0000 UTC m=+1132.016667907" lastFinishedPulling="2026-01-30 05:26:37.515618946 +0000 UTC m=+1132.885529203" observedRunningTime="2026-01-30 05:26:39.591518039 +0000 UTC m=+1134.961428326" watchObservedRunningTime="2026-01-30 05:26:39.593663766 +0000 UTC m=+1134.963574023"
Jan 30 05:26:40 crc kubenswrapper[4931]: I0130 05:26:40.595552 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerStarted","Data":"08e9531045b4ed348a8e15f9e06ca988f965f1f605cc60da77be0b95272ee581"}
Jan 30 05:26:41 crc kubenswrapper[4931]: I0130 05:26:41.201300 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 30 05:26:41 crc kubenswrapper[4931]: I0130 05:26:41.608700 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerStarted","Data":"b2d093b072c9cdec044442a6734371a3413de8a6cea48d10abb06780f1cd4e75"}
Jan 30 05:26:41 crc kubenswrapper[4931]: I0130 05:26:41.608737 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerStarted","Data":"f4f164366b57c885cd381de9f7095a47cecfe7af0c8f7b404360ca7dbbfa150d"}
Jan 30 05:26:41 crc kubenswrapper[4931]: I0130 05:26:41.687603 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-794bfbdd44-9msr6" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:45332->10.217.0.164:9311: read: connection reset by peer"
Jan 30 05:26:41 crc kubenswrapper[4931]: I0130 05:26:41.687960 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-794bfbdd44-9msr6" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:45318->10.217.0.164:9311: read: connection reset by peer"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.344144 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-794bfbdd44-9msr6"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.412407 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data-custom\") pod \"c3dfec36-0758-42c6-8c28-997044eb59a3\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") "
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.412506 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dfec36-0758-42c6-8c28-997044eb59a3-logs\") pod \"c3dfec36-0758-42c6-8c28-997044eb59a3\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") "
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.412641 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l264t\" (UniqueName: \"kubernetes.io/projected/c3dfec36-0758-42c6-8c28-997044eb59a3-kube-api-access-l264t\") pod \"c3dfec36-0758-42c6-8c28-997044eb59a3\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") "
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.412705 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-combined-ca-bundle\") pod \"c3dfec36-0758-42c6-8c28-997044eb59a3\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") "
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.412786 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data\") pod \"c3dfec36-0758-42c6-8c28-997044eb59a3\" (UID: \"c3dfec36-0758-42c6-8c28-997044eb59a3\") "
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.414117 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3dfec36-0758-42c6-8c28-997044eb59a3-logs" (OuterVolumeSpecName: "logs") pod "c3dfec36-0758-42c6-8c28-997044eb59a3" (UID: "c3dfec36-0758-42c6-8c28-997044eb59a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.419677 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c3dfec36-0758-42c6-8c28-997044eb59a3" (UID: "c3dfec36-0758-42c6-8c28-997044eb59a3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.424478 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3dfec36-0758-42c6-8c28-997044eb59a3-kube-api-access-l264t" (OuterVolumeSpecName: "kube-api-access-l264t") pod "c3dfec36-0758-42c6-8c28-997044eb59a3" (UID: "c3dfec36-0758-42c6-8c28-997044eb59a3"). InnerVolumeSpecName "kube-api-access-l264t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.448446 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3dfec36-0758-42c6-8c28-997044eb59a3" (UID: "c3dfec36-0758-42c6-8c28-997044eb59a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.503750 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data" (OuterVolumeSpecName: "config-data") pod "c3dfec36-0758-42c6-8c28-997044eb59a3" (UID: "c3dfec36-0758-42c6-8c28-997044eb59a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.515788 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dfec36-0758-42c6-8c28-997044eb59a3-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.515839 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l264t\" (UniqueName: \"kubernetes.io/projected/c3dfec36-0758-42c6-8c28-997044eb59a3-kube-api-access-l264t\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.515861 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.515879 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.515897 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3dfec36-0758-42c6-8c28-997044eb59a3-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.622376 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerID="7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819" exitCode=0
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.622471 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794bfbdd44-9msr6" event={"ID":"c3dfec36-0758-42c6-8c28-997044eb59a3","Type":"ContainerDied","Data":"7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819"}
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.622521 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-794bfbdd44-9msr6"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.622542 4931 scope.go:117] "RemoveContainer" containerID="7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.622528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-794bfbdd44-9msr6" event={"ID":"c3dfec36-0758-42c6-8c28-997044eb59a3","Type":"ContainerDied","Data":"d2345973c91074357738f420a2e030c74216ac2f1de71e5a31e295982e276e96"}
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.659208 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-794bfbdd44-9msr6"]
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.662199 4931 scope.go:117] "RemoveContainer" containerID="845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.668113 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-794bfbdd44-9msr6"]
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.687099 4931 scope.go:117] "RemoveContainer" containerID="7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819"
Jan 30 05:26:42 crc kubenswrapper[4931]: E0130 05:26:42.687683 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819\": container with ID starting with 7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819 not found: ID does not exist" containerID="7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.687733 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819"} err="failed to get container status \"7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819\": rpc error: code = NotFound desc = could not find container \"7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819\": container with ID starting with 7b6be02bbcc8d0b1586d305237296d73b7b3f419c5f0464691a475c292d19819 not found: ID does not exist"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.687761 4931 scope.go:117] "RemoveContainer" containerID="845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5"
Jan 30 05:26:42 crc kubenswrapper[4931]: E0130 05:26:42.688133 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5\": container with ID starting with 845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5 not found: ID does not exist" containerID="845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5"
Jan 30 05:26:42 crc kubenswrapper[4931]: I0130 05:26:42.688194 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5"} err="failed to get container status \"845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5\": rpc error: code = NotFound desc = could not find container \"845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5\": container with ID starting with 845309e636acb01f517162d220ba0171bf46701cbbd559935997c52d2058a8d5 not found: ID does not exist"
Jan 30 05:26:43 crc kubenswrapper[4931]: I0130 05:26:43.463297 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" path="/var/lib/kubelet/pods/c3dfec36-0758-42c6-8c28-997044eb59a3/volumes"
Jan 30 05:26:43 crc kubenswrapper[4931]: I0130 05:26:43.641462 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 05:26:43 crc kubenswrapper[4931]: I0130 05:26:43.669167 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.857656306 podStartE2EDuration="5.669153435s" podCreationTimestamp="2026-01-30 05:26:38 +0000 UTC" firstStartedPulling="2026-01-30 05:26:39.541335516 +0000 UTC m=+1134.911245773" lastFinishedPulling="2026-01-30 05:26:43.352832625 +0000 UTC m=+1138.722742902" observedRunningTime="2026-01-30 05:26:43.66834564 +0000 UTC m=+1139.038255927" watchObservedRunningTime="2026-01-30 05:26:43.669153435 +0000 UTC m=+1139.039063692"
Jan 30 05:26:44 crc kubenswrapper[4931]: I0130 05:26:44.659781 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerStarted","Data":"9e8bac5316b66891bb65008c7a82aba4d5b92fb6001c63ba8c49a06e95040b7f"}
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.167705 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk"
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.261810 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-zmls6"]
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.262077 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" podUID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" containerName="dnsmasq-dns" containerID="cri-o://d07be3298c7001c20f2b88e58ea237b62170a8ccd4a24bfb286b3d0a2bff7807" gracePeriod=10
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.466371 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.515152 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.682130 4931 generic.go:334] "Generic (PLEG): container finished" podID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" containerID="d07be3298c7001c20f2b88e58ea237b62170a8ccd4a24bfb286b3d0a2bff7807" exitCode=0
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.682529 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="cinder-scheduler" containerID="cri-o://1ee4814b304dc2facdec4fe5a7ec548e21648d29d6f5eb9a0d58da2eecb4e24b" gracePeriod=30
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.682608 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="probe" containerID="cri-o://136922e5f994f3ec703ee0b76647539238ec38bef505c93a23f26dc8f73ef24d" gracePeriod=30
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.682620 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" event={"ID":"4aa89fd3-2a8a-424c-b3a7-cf743d90a249","Type":"ContainerDied","Data":"d07be3298c7001c20f2b88e58ea237b62170a8ccd4a24bfb286b3d0a2bff7807"}
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.807072 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6"
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.922777 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-sb\") pod \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") "
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.922844 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-config\") pod \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") "
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.922919 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-swift-storage-0\") pod \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") "
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.923054 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2lmn\" (UniqueName: \"kubernetes.io/projected/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-kube-api-access-z2lmn\") pod \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") "
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.923077 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-nb\") pod \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") "
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.923101 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-svc\") pod \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\" (UID: \"4aa89fd3-2a8a-424c-b3a7-cf743d90a249\") "
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.930172 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-kube-api-access-z2lmn" (OuterVolumeSpecName: "kube-api-access-z2lmn") pod "4aa89fd3-2a8a-424c-b3a7-cf743d90a249" (UID: "4aa89fd3-2a8a-424c-b3a7-cf743d90a249"). InnerVolumeSpecName "kube-api-access-z2lmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.980933 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4aa89fd3-2a8a-424c-b3a7-cf743d90a249" (UID: "4aa89fd3-2a8a-424c-b3a7-cf743d90a249"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4931]: I0130 05:26:46.985636 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4aa89fd3-2a8a-424c-b3a7-cf743d90a249" (UID: "4aa89fd3-2a8a-424c-b3a7-cf743d90a249"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.002162 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4aa89fd3-2a8a-424c-b3a7-cf743d90a249" (UID: "4aa89fd3-2a8a-424c-b3a7-cf743d90a249"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.005875 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4aa89fd3-2a8a-424c-b3a7-cf743d90a249" (UID: "4aa89fd3-2a8a-424c-b3a7-cf743d90a249"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.010265 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-config" (OuterVolumeSpecName: "config") pod "4aa89fd3-2a8a-424c-b3a7-cf743d90a249" (UID: "4aa89fd3-2a8a-424c-b3a7-cf743d90a249"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.025746 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.025937 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.026045 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.026134 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:47 crc
kubenswrapper[4931]: I0130 05:26:47.026387 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.026475 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2lmn\" (UniqueName: \"kubernetes.io/projected/4aa89fd3-2a8a-424c-b3a7-cf743d90a249-kube-api-access-z2lmn\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.696119 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" event={"ID":"4aa89fd3-2a8a-424c-b3a7-cf743d90a249","Type":"ContainerDied","Data":"b7351472d16e045cf1d352d57e3502d62cfe0a1c627e0387d4154e9570e9d7c6"} Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.697545 4931 scope.go:117] "RemoveContainer" containerID="d07be3298c7001c20f2b88e58ea237b62170a8ccd4a24bfb286b3d0a2bff7807" Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.696444 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-zmls6" Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.701342 4931 generic.go:334] "Generic (PLEG): container finished" podID="1f487872-4003-4559-8f72-1c6022321160" containerID="136922e5f994f3ec703ee0b76647539238ec38bef505c93a23f26dc8f73ef24d" exitCode=0 Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.701490 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1f487872-4003-4559-8f72-1c6022321160","Type":"ContainerDied","Data":"136922e5f994f3ec703ee0b76647539238ec38bef505c93a23f26dc8f73ef24d"} Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.726673 4931 scope.go:117] "RemoveContainer" containerID="42238edb208312fe89370f3d6e71cdbdaaf5a688762779ee0068776a658a91e9" Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.731462 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-zmls6"] Jan 30 05:26:47 crc kubenswrapper[4931]: I0130 05:26:47.741528 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-zmls6"] Jan 30 05:26:48 crc kubenswrapper[4931]: I0130 05:26:48.562718 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.435794 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" path="/var/lib/kubelet/pods/4aa89fd3-2a8a-424c-b3a7-cf743d90a249/volumes" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.727128 4931 generic.go:334] "Generic (PLEG): container finished" podID="1f487872-4003-4559-8f72-1c6022321160" containerID="1ee4814b304dc2facdec4fe5a7ec548e21648d29d6f5eb9a0d58da2eecb4e24b" exitCode=0 Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.727170 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"1f487872-4003-4559-8f72-1c6022321160","Type":"ContainerDied","Data":"1ee4814b304dc2facdec4fe5a7ec548e21648d29d6f5eb9a0d58da2eecb4e24b"} Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.727195 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1f487872-4003-4559-8f72-1c6022321160","Type":"ContainerDied","Data":"16910d816a28eef85f00dcbaeae0524c9893ffdb8537d1cf664654c8a4d009f8"} Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.727205 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16910d816a28eef85f00dcbaeae0524c9893ffdb8537d1cf664654c8a4d009f8" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.736237 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.763393 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-dc49c789d-5gcj4" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878092 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f487872-4003-4559-8f72-1c6022321160-etc-machine-id\") pod \"1f487872-4003-4559-8f72-1c6022321160\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878220 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pkqp\" (UniqueName: \"kubernetes.io/projected/1f487872-4003-4559-8f72-1c6022321160-kube-api-access-2pkqp\") pod \"1f487872-4003-4559-8f72-1c6022321160\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878258 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-scripts\") pod 
\"1f487872-4003-4559-8f72-1c6022321160\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878298 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data\") pod \"1f487872-4003-4559-8f72-1c6022321160\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878315 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-combined-ca-bundle\") pod \"1f487872-4003-4559-8f72-1c6022321160\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878398 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data-custom\") pod \"1f487872-4003-4559-8f72-1c6022321160\" (UID: \"1f487872-4003-4559-8f72-1c6022321160\") " Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878207 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f487872-4003-4559-8f72-1c6022321160-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1f487872-4003-4559-8f72-1c6022321160" (UID: "1f487872-4003-4559-8f72-1c6022321160"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.878788 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f487872-4003-4559-8f72-1c6022321160-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.888700 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f487872-4003-4559-8f72-1c6022321160" (UID: "1f487872-4003-4559-8f72-1c6022321160"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.888848 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f487872-4003-4559-8f72-1c6022321160-kube-api-access-2pkqp" (OuterVolumeSpecName: "kube-api-access-2pkqp") pod "1f487872-4003-4559-8f72-1c6022321160" (UID: "1f487872-4003-4559-8f72-1c6022321160"). InnerVolumeSpecName "kube-api-access-2pkqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.899098 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-scripts" (OuterVolumeSpecName: "scripts") pod "1f487872-4003-4559-8f72-1c6022321160" (UID: "1f487872-4003-4559-8f72-1c6022321160"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.947232 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f487872-4003-4559-8f72-1c6022321160" (UID: "1f487872-4003-4559-8f72-1c6022321160"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.981396 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.981433 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.981444 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4931]: I0130 05:26:49.981454 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pkqp\" (UniqueName: \"kubernetes.io/projected/1f487872-4003-4559-8f72-1c6022321160-kube-api-access-2pkqp\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.049203 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data" (OuterVolumeSpecName: "config-data") pod "1f487872-4003-4559-8f72-1c6022321160" (UID: "1f487872-4003-4559-8f72-1c6022321160"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.082577 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f487872-4003-4559-8f72-1c6022321160-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.244173 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-687c697484-j2btt" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.162:9696/\": dial tcp 10.217.0.162:9696: connect: connection refused" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.736687 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.784000 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.791751 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.804151 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.816682 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:50 crc kubenswrapper[4931]: E0130 05:26:50.817000 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817016 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api" Jan 30 05:26:50 crc kubenswrapper[4931]: E0130 05:26:50.817029 4931 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" containerName="dnsmasq-dns" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817035 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" containerName="dnsmasq-dns" Jan 30 05:26:50 crc kubenswrapper[4931]: E0130 05:26:50.817049 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="probe" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817055 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="probe" Jan 30 05:26:50 crc kubenswrapper[4931]: E0130 05:26:50.817077 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="cinder-scheduler" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817082 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="cinder-scheduler" Jan 30 05:26:50 crc kubenswrapper[4931]: E0130 05:26:50.817093 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api-log" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817098 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api-log" Jan 30 05:26:50 crc kubenswrapper[4931]: E0130 05:26:50.817113 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" containerName="init" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817119 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" containerName="init" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817272 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa89fd3-2a8a-424c-b3a7-cf743d90a249" 
containerName="dnsmasq-dns" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817287 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817298 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dfec36-0758-42c6-8c28-997044eb59a3" containerName="barbican-api-log" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817310 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="cinder-scheduler" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.817321 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f487872-4003-4559-8f72-1c6022321160" containerName="probe" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.818184 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.823726 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.870443 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.874981 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.897318 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.897466 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rplvd\" (UniqueName: \"kubernetes.io/projected/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-kube-api-access-rplvd\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.897488 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.897535 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.897589 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.897607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.999221 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rplvd\" (UniqueName: \"kubernetes.io/projected/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-kube-api-access-rplvd\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.999269 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.999315 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.999369 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.999392 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.999458 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:50 crc kubenswrapper[4931]: I0130 05:26:50.999492 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.004637 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.005881 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.012607 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.020984 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.020999 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rplvd\" (UniqueName: \"kubernetes.io/projected/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-kube-api-access-rplvd\") pod \"cinder-scheduler-0\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.021050 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.146564 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.267649 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.368231 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.419871 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d9d68b44b-5gp25"] Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.436742 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f487872-4003-4559-8f72-1c6022321160" path="/var/lib/kubelet/pods/1f487872-4003-4559-8f72-1c6022321160/volumes" Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.617181 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:51 crc kubenswrapper[4931]: I0130 05:26:51.757118 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd","Type":"ContainerStarted","Data":"300c4ac1a78a0898043a5bb9c0ea1e976d3646b2689510ee5ed5d0a93470d249"} Jan 30 05:26:52 crc kubenswrapper[4931]: I0130 05:26:52.766885 4931 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-d9d68b44b-5gp25" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-log" containerID="cri-o://133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21" gracePeriod=30 Jan 30 05:26:52 crc kubenswrapper[4931]: I0130 05:26:52.767053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd","Type":"ContainerStarted","Data":"9398f7e713fb447b3a151d286be2d2910e4d8535fd421e906c46b8cc2c9a4728"} Jan 30 05:26:52 crc kubenswrapper[4931]: I0130 05:26:52.767403 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-d9d68b44b-5gp25" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-api" containerID="cri-o://fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82" gracePeriod=30 Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.580616 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75d9f6f6ff-kmswn" Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.640290 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dc49c789d-5gcj4"] Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.640579 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dc49c789d-5gcj4" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-api" containerID="cri-o://f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a" gracePeriod=30 Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.640653 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dc49c789d-5gcj4" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-httpd" containerID="cri-o://e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2" gracePeriod=30 Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.776822 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd","Type":"ContainerStarted","Data":"571155fa2c4a4cc11bc78f96b7b5b636cdda183726d54338d2bf0cc02d77f003"} Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.778661 4931 generic.go:334] "Generic (PLEG): container finished" podID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerID="133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21" exitCode=143 Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.778697 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9d68b44b-5gp25" event={"ID":"b92991ff-5b79-452a-b5ac-9dc90ab42f68","Type":"ContainerDied","Data":"133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21"} Jan 30 05:26:53 crc kubenswrapper[4931]: I0130 05:26:53.796692 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.796676233 podStartE2EDuration="3.796676233s" podCreationTimestamp="2026-01-30 05:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:53.794474365 +0000 UTC m=+1149.164384622" watchObservedRunningTime="2026-01-30 05:26:53.796676233 +0000 UTC m=+1149.166586490" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.791350 4931 generic.go:334] "Generic (PLEG): container finished" podID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerID="e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2" exitCode=0 Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.791412 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc49c789d-5gcj4" event={"ID":"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec","Type":"ContainerDied","Data":"e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2"} Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.791670 4931 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.792940 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.794499 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lxhtr" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.795937 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.796896 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.800742 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.870581 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hxs9\" (UniqueName: \"kubernetes.io/projected/43c51602-467a-46a4-a7e5-898e988d56b4-kube-api-access-2hxs9\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.870858 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.870925 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config-secret\") 
pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.871072 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.973471 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.973535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.973584 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.973677 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hxs9\" (UniqueName: \"kubernetes.io/projected/43c51602-467a-46a4-a7e5-898e988d56b4-kube-api-access-2hxs9\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 
crc kubenswrapper[4931]: I0130 05:26:54.974636 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.979483 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.980947 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:54 crc kubenswrapper[4931]: I0130 05:26:54.999326 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hxs9\" (UniqueName: \"kubernetes.io/projected/43c51602-467a-46a4-a7e5-898e988d56b4-kube-api-access-2hxs9\") pod \"openstackclient\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.115244 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.135700 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.166961 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.193868 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.195213 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.204529 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.282485 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.282719 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.282780 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k758\" (UniqueName: \"kubernetes.io/projected/6b263e8e-7618-4044-bed1-b35174d6a8f4-kube-api-access-5k758\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" 
Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.282824 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: E0130 05:26:55.325794 4931 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 30 05:26:55 crc kubenswrapper[4931]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_43c51602-467a-46a4-a7e5-898e988d56b4_0(6fc7fa539391e55acf05b41581f8c91bef648896bfa36ae10f211c8f9c6c301a): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6fc7fa539391e55acf05b41581f8c91bef648896bfa36ae10f211c8f9c6c301a" Netns:"/var/run/netns/1e968b92-6b2c-4c2c-9be1-cc1e18187ac7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6fc7fa539391e55acf05b41581f8c91bef648896bfa36ae10f211c8f9c6c301a;K8S_POD_UID=43c51602-467a-46a4-a7e5-898e988d56b4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/43c51602-467a-46a4-a7e5-898e988d56b4]: expected pod UID "43c51602-467a-46a4-a7e5-898e988d56b4" but got "6b263e8e-7618-4044-bed1-b35174d6a8f4" from Kube API Jan 30 05:26:55 crc kubenswrapper[4931]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:26:55 crc kubenswrapper[4931]: > Jan 30 05:26:55 crc kubenswrapper[4931]: E0130 05:26:55.325862 4931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 30 05:26:55 crc kubenswrapper[4931]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_43c51602-467a-46a4-a7e5-898e988d56b4_0(6fc7fa539391e55acf05b41581f8c91bef648896bfa36ae10f211c8f9c6c301a): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6fc7fa539391e55acf05b41581f8c91bef648896bfa36ae10f211c8f9c6c301a" Netns:"/var/run/netns/1e968b92-6b2c-4c2c-9be1-cc1e18187ac7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6fc7fa539391e55acf05b41581f8c91bef648896bfa36ae10f211c8f9c6c301a;K8S_POD_UID=43c51602-467a-46a4-a7e5-898e988d56b4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/43c51602-467a-46a4-a7e5-898e988d56b4]: expected pod UID "43c51602-467a-46a4-a7e5-898e988d56b4" but got "6b263e8e-7618-4044-bed1-b35174d6a8f4" from Kube API Jan 30 05:26:55 crc kubenswrapper[4931]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:26:55 crc kubenswrapper[4931]: > pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.384871 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.384915 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.384998 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k758\" (UniqueName: \"kubernetes.io/projected/6b263e8e-7618-4044-bed1-b35174d6a8f4-kube-api-access-5k758\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.385055 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.386060 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.390043 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.390297 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.404947 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k758\" (UniqueName: \"kubernetes.io/projected/6b263e8e-7618-4044-bed1-b35174d6a8f4-kube-api-access-5k758\") pod \"openstackclient\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.515108 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.680711 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-687c697484-j2btt_84203bc9-afb4-42cb-843d-c211490ce275/neutron-api/0.log" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.680777 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.791600 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-ovndb-tls-certs\") pod \"84203bc9-afb4-42cb-843d-c211490ce275\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.791897 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-config\") pod \"84203bc9-afb4-42cb-843d-c211490ce275\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.791919 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jwc6\" (UniqueName: \"kubernetes.io/projected/84203bc9-afb4-42cb-843d-c211490ce275-kube-api-access-6jwc6\") pod \"84203bc9-afb4-42cb-843d-c211490ce275\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.791962 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-httpd-config\") pod \"84203bc9-afb4-42cb-843d-c211490ce275\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.792133 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-combined-ca-bundle\") pod \"84203bc9-afb4-42cb-843d-c211490ce275\" (UID: \"84203bc9-afb4-42cb-843d-c211490ce275\") " Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.816825 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-687c697484-j2btt_84203bc9-afb4-42cb-843d-c211490ce275/neutron-api/0.log" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.816873 4931 generic.go:334] "Generic (PLEG): container finished" podID="84203bc9-afb4-42cb-843d-c211490ce275" containerID="56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e" exitCode=137 Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.816956 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.817000 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-687c697484-j2btt" event={"ID":"84203bc9-afb4-42cb-843d-c211490ce275","Type":"ContainerDied","Data":"56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e"} Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.817060 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-687c697484-j2btt" event={"ID":"84203bc9-afb4-42cb-843d-c211490ce275","Type":"ContainerDied","Data":"ae04b2bb5e94306ea64985be9c8f6deb0c75c6536be8008e597d586d1aee985e"} Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.817078 4931 scope.go:117] "RemoveContainer" containerID="14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.817089 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-687c697484-j2btt" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.820576 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "84203bc9-afb4-42cb-843d-c211490ce275" (UID: "84203bc9-afb4-42cb-843d-c211490ce275"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.835412 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84203bc9-afb4-42cb-843d-c211490ce275-kube-api-access-6jwc6" (OuterVolumeSpecName: "kube-api-access-6jwc6") pod "84203bc9-afb4-42cb-843d-c211490ce275" (UID: "84203bc9-afb4-42cb-843d-c211490ce275"). InnerVolumeSpecName "kube-api-access-6jwc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.838371 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.854632 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="43c51602-467a-46a4-a7e5-898e988d56b4" podUID="6b263e8e-7618-4044-bed1-b35174d6a8f4" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.864867 4931 scope.go:117] "RemoveContainer" containerID="56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.868552 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-config" (OuterVolumeSpecName: "config") pod "84203bc9-afb4-42cb-843d-c211490ce275" (UID: "84203bc9-afb4-42cb-843d-c211490ce275"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.883222 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "84203bc9-afb4-42cb-843d-c211490ce275" (UID: "84203bc9-afb4-42cb-843d-c211490ce275"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.887462 4931 scope.go:117] "RemoveContainer" containerID="14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.891525 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84203bc9-afb4-42cb-843d-c211490ce275" (UID: "84203bc9-afb4-42cb-843d-c211490ce275"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:55 crc kubenswrapper[4931]: E0130 05:26:55.891553 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625\": container with ID starting with 14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625 not found: ID does not exist" containerID="14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.891590 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625"} err="failed to get container status \"14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625\": rpc error: code = NotFound desc = could not find container \"14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625\": container with ID starting with 14ccbe55d693286bb36c8c2d9d691bc7dcb9c2a613cfa2888cedb7d4d536b625 not found: ID does not exist" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.891616 4931 scope.go:117] "RemoveContainer" containerID="56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e" Jan 30 05:26:55 crc kubenswrapper[4931]: E0130 05:26:55.893638 4931 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e\": container with ID starting with 56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e not found: ID does not exist" containerID="56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.893697 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e"} err="failed to get container status \"56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e\": rpc error: code = NotFound desc = could not find container \"56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e\": container with ID starting with 56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e not found: ID does not exist" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.902860 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.902908 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.902920 4931 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.902928 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/84203bc9-afb4-42cb-843d-c211490ce275-config\") on 
node \"crc\" DevicePath \"\"" Jan 30 05:26:55 crc kubenswrapper[4931]: I0130 05:26:55.902937 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jwc6\" (UniqueName: \"kubernetes.io/projected/84203bc9-afb4-42cb-843d-c211490ce275-kube-api-access-6jwc6\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.003740 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hxs9\" (UniqueName: \"kubernetes.io/projected/43c51602-467a-46a4-a7e5-898e988d56b4-kube-api-access-2hxs9\") pod \"43c51602-467a-46a4-a7e5-898e988d56b4\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.003891 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config\") pod \"43c51602-467a-46a4-a7e5-898e988d56b4\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.003912 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config-secret\") pod \"43c51602-467a-46a4-a7e5-898e988d56b4\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.003949 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-combined-ca-bundle\") pod \"43c51602-467a-46a4-a7e5-898e988d56b4\" (UID: \"43c51602-467a-46a4-a7e5-898e988d56b4\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.004848 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config" 
(OuterVolumeSpecName: "openstack-config") pod "43c51602-467a-46a4-a7e5-898e988d56b4" (UID: "43c51602-467a-46a4-a7e5-898e988d56b4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.009814 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c51602-467a-46a4-a7e5-898e988d56b4-kube-api-access-2hxs9" (OuterVolumeSpecName: "kube-api-access-2hxs9") pod "43c51602-467a-46a4-a7e5-898e988d56b4" (UID: "43c51602-467a-46a4-a7e5-898e988d56b4"). InnerVolumeSpecName "kube-api-access-2hxs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.011108 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43c51602-467a-46a4-a7e5-898e988d56b4" (UID: "43c51602-467a-46a4-a7e5-898e988d56b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.011319 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "43c51602-467a-46a4-a7e5-898e988d56b4" (UID: "43c51602-467a-46a4-a7e5-898e988d56b4"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.086628 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.105494 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hxs9\" (UniqueName: \"kubernetes.io/projected/43c51602-467a-46a4-a7e5-898e988d56b4-kube-api-access-2hxs9\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.105522 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.105531 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.105539 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c51602-467a-46a4-a7e5-898e988d56b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.153726 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.189287 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-687c697484-j2btt"] Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.197434 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-687c697484-j2btt"] Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.327971 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.408968 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-internal-tls-certs\") pod \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.409012 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-public-tls-certs\") pod \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.409049 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-combined-ca-bundle\") pod \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.409085 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-scripts\") pod \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.409112 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-config-data\") pod \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.409156 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bz2z\" 
(UniqueName: \"kubernetes.io/projected/b92991ff-5b79-452a-b5ac-9dc90ab42f68-kube-api-access-9bz2z\") pod \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.409180 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92991ff-5b79-452a-b5ac-9dc90ab42f68-logs\") pod \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\" (UID: \"b92991ff-5b79-452a-b5ac-9dc90ab42f68\") " Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.409938 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92991ff-5b79-452a-b5ac-9dc90ab42f68-logs" (OuterVolumeSpecName: "logs") pod "b92991ff-5b79-452a-b5ac-9dc90ab42f68" (UID: "b92991ff-5b79-452a-b5ac-9dc90ab42f68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.417219 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-scripts" (OuterVolumeSpecName: "scripts") pod "b92991ff-5b79-452a-b5ac-9dc90ab42f68" (UID: "b92991ff-5b79-452a-b5ac-9dc90ab42f68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.417238 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92991ff-5b79-452a-b5ac-9dc90ab42f68-kube-api-access-9bz2z" (OuterVolumeSpecName: "kube-api-access-9bz2z") pod "b92991ff-5b79-452a-b5ac-9dc90ab42f68" (UID: "b92991ff-5b79-452a-b5ac-9dc90ab42f68"). InnerVolumeSpecName "kube-api-access-9bz2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.462274 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-config-data" (OuterVolumeSpecName: "config-data") pod "b92991ff-5b79-452a-b5ac-9dc90ab42f68" (UID: "b92991ff-5b79-452a-b5ac-9dc90ab42f68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.471046 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b92991ff-5b79-452a-b5ac-9dc90ab42f68" (UID: "b92991ff-5b79-452a-b5ac-9dc90ab42f68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.511215 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b92991ff-5b79-452a-b5ac-9dc90ab42f68" (UID: "b92991ff-5b79-452a-b5ac-9dc90ab42f68"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.511259 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.511280 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.511291 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.511302 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bz2z\" (UniqueName: \"kubernetes.io/projected/b92991ff-5b79-452a-b5ac-9dc90ab42f68-kube-api-access-9bz2z\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.511313 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92991ff-5b79-452a-b5ac-9dc90ab42f68-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.546678 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b92991ff-5b79-452a-b5ac-9dc90ab42f68" (UID: "b92991ff-5b79-452a-b5ac-9dc90ab42f68"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.613316 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.613354 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92991ff-5b79-452a-b5ac-9dc90ab42f68-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.826983 4931 generic.go:334] "Generic (PLEG): container finished" podID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerID="fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82" exitCode=0 Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.827050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9d68b44b-5gp25" event={"ID":"b92991ff-5b79-452a-b5ac-9dc90ab42f68","Type":"ContainerDied","Data":"fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82"} Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.827082 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9d68b44b-5gp25" event={"ID":"b92991ff-5b79-452a-b5ac-9dc90ab42f68","Type":"ContainerDied","Data":"040b81795acd0bef7c76b7a99d650deaac66b5fa82f97baf669121be56928797"} Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.827101 4931 scope.go:117] "RemoveContainer" containerID="fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.827203 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9d68b44b-5gp25" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.846977 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.847028 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6b263e8e-7618-4044-bed1-b35174d6a8f4","Type":"ContainerStarted","Data":"14528f0946216f7b2b6764667e27591d86ee74bba97b6ce9081e9dedf29c1572"} Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.867508 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d9d68b44b-5gp25"] Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.867903 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="43c51602-467a-46a4-a7e5-898e988d56b4" podUID="6b263e8e-7618-4044-bed1-b35174d6a8f4" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.873903 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d9d68b44b-5gp25"] Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.891922 4931 scope.go:117] "RemoveContainer" containerID="133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.924565 4931 scope.go:117] "RemoveContainer" containerID="fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82" Jan 30 05:26:56 crc kubenswrapper[4931]: E0130 05:26:56.925504 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82\": container with ID starting with fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82 not found: ID does not exist" containerID="fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.925547 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82"} err="failed to get container status \"fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82\": rpc error: code = NotFound desc = could not find container \"fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82\": container with ID starting with fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82 not found: ID does not exist" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.925573 4931 scope.go:117] "RemoveContainer" containerID="133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21" Jan 30 05:26:56 crc kubenswrapper[4931]: E0130 05:26:56.927451 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21\": container with ID starting with 133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21 not found: ID does not exist" containerID="133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21" Jan 30 05:26:56 crc kubenswrapper[4931]: I0130 05:26:56.927484 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21"} err="failed to get container status \"133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21\": rpc error: code = NotFound desc = could not find container \"133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21\": container with ID starting with 133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21 not found: ID does not exist" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.362698 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.363024 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.363063 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.363627 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.363685 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f" gracePeriod=600 Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.434611 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c51602-467a-46a4-a7e5-898e988d56b4" path="/var/lib/kubelet/pods/43c51602-467a-46a4-a7e5-898e988d56b4/volumes" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.435113 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84203bc9-afb4-42cb-843d-c211490ce275" path="/var/lib/kubelet/pods/84203bc9-afb4-42cb-843d-c211490ce275/volumes" Jan 30 
05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.435881 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" path="/var/lib/kubelet/pods/b92991ff-5b79-452a-b5ac-9dc90ab42f68/volumes" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.712196 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.837959 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-combined-ca-bundle\") pod \"807d8709-a403-4186-83f5-ec76aee793fe\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.838037 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data\") pod \"807d8709-a403-4186-83f5-ec76aee793fe\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.838087 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d8709-a403-4186-83f5-ec76aee793fe-logs\") pod \"807d8709-a403-4186-83f5-ec76aee793fe\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.838164 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrvbk\" (UniqueName: \"kubernetes.io/projected/807d8709-a403-4186-83f5-ec76aee793fe-kube-api-access-jrvbk\") pod \"807d8709-a403-4186-83f5-ec76aee793fe\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.838218 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data-custom\") pod \"807d8709-a403-4186-83f5-ec76aee793fe\" (UID: \"807d8709-a403-4186-83f5-ec76aee793fe\") " Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.841747 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807d8709-a403-4186-83f5-ec76aee793fe-logs" (OuterVolumeSpecName: "logs") pod "807d8709-a403-4186-83f5-ec76aee793fe" (UID: "807d8709-a403-4186-83f5-ec76aee793fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.861605 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "807d8709-a403-4186-83f5-ec76aee793fe" (UID: "807d8709-a403-4186-83f5-ec76aee793fe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.863637 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807d8709-a403-4186-83f5-ec76aee793fe-kube-api-access-jrvbk" (OuterVolumeSpecName: "kube-api-access-jrvbk") pod "807d8709-a403-4186-83f5-ec76aee793fe" (UID: "807d8709-a403-4186-83f5-ec76aee793fe"). InnerVolumeSpecName "kube-api-access-jrvbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.897463 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "807d8709-a403-4186-83f5-ec76aee793fe" (UID: "807d8709-a403-4186-83f5-ec76aee793fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.900565 4931 generic.go:334] "Generic (PLEG): container finished" podID="807d8709-a403-4186-83f5-ec76aee793fe" containerID="4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0" exitCode=137 Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.900615 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" event={"ID":"807d8709-a403-4186-83f5-ec76aee793fe","Type":"ContainerDied","Data":"4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0"} Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.900640 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" event={"ID":"807d8709-a403-4186-83f5-ec76aee793fe","Type":"ContainerDied","Data":"fc0c653d3e574db62881709b302919c961837f9a8fc28421f26c150c1cbda477"} Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.900656 4931 scope.go:117] "RemoveContainer" containerID="4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.900741 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-f7d589966-mkfs5" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.910356 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f" exitCode=0 Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.910391 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f"} Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.910432 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"083c2726f719c1b6c228fc0d209a309a403985263c1ced3ea0982529442fd973"} Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.933601 4931 scope.go:117] "RemoveContainer" containerID="74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.936010 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data" (OuterVolumeSpecName: "config-data") pod "807d8709-a403-4186-83f5-ec76aee793fe" (UID: "807d8709-a403-4186-83f5-ec76aee793fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.941072 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrvbk\" (UniqueName: \"kubernetes.io/projected/807d8709-a403-4186-83f5-ec76aee793fe-kube-api-access-jrvbk\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.941100 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.941110 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.941120 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d8709-a403-4186-83f5-ec76aee793fe-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.941129 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d8709-a403-4186-83f5-ec76aee793fe-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.970582 4931 scope.go:117] "RemoveContainer" containerID="4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0" Jan 30 05:26:57 crc kubenswrapper[4931]: E0130 05:26:57.971965 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0\": container with ID starting with 4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0 not found: ID does not exist" 
containerID="4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.971999 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0"} err="failed to get container status \"4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0\": rpc error: code = NotFound desc = could not find container \"4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0\": container with ID starting with 4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0 not found: ID does not exist" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.972021 4931 scope.go:117] "RemoveContainer" containerID="74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6" Jan 30 05:26:57 crc kubenswrapper[4931]: E0130 05:26:57.972382 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6\": container with ID starting with 74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6 not found: ID does not exist" containerID="74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.972401 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6"} err="failed to get container status \"74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6\": rpc error: code = NotFound desc = could not find container \"74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6\": container with ID starting with 74b2b389570e4a1035ef2e2c5f7eb1cceb4dcfec00b6a918fb6f7e2900be1dd6 not found: ID does not exist" Jan 30 05:26:57 crc kubenswrapper[4931]: I0130 05:26:57.972412 4931 scope.go:117] 
"RemoveContainer" containerID="60aa2b4543ca7f8bb21bef3c167f0da099829d6ddc544f185e09f8c4de74ad75" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.247848 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-f7d589966-mkfs5"] Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.261908 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76fb878d5c-s22sw"] Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.266197 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener-log" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266222 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener-log" Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.266233 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-log" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266239 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-log" Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.266255 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-api" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266260 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-api" Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.266270 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-api" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266276 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-api" Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.266287 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-httpd" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266293 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-httpd" Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.266304 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266309 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266569 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266584 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-httpd" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266591 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="807d8709-a403-4186-83f5-ec76aee793fe" containerName="barbican-keystone-listener-log" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266598 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="84203bc9-afb4-42cb-843d-c211490ce275" containerName="neutron-api" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266605 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-log" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.266613 4931 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b92991ff-5b79-452a-b5ac-9dc90ab42f68" containerName="placement-api" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.267540 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76fb878d5c-s22sw" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.273351 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.273615 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.278236 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.283691 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-f7d589966-mkfs5"] Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.297293 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76fb878d5c-s22sw"] Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.346503 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-public-tls-certs\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw" Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.346567 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-log-httpd\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw" Jan 30 05:26:58 crc 
kubenswrapper[4931]: I0130 05:26:58.346594 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-combined-ca-bundle\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.346642 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsxq8\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-kube-api-access-dsxq8\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.346661 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-run-httpd\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.346686 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-config-data\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.346712 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.346750 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-internal-tls-certs\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.448900 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-combined-ca-bundle\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449234 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsxq8\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-kube-api-access-dsxq8\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449262 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-run-httpd\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449294 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-config-data\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449332 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449372 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-internal-tls-certs\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449443 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-public-tls-certs\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449481 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-log-httpd\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449910 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-run-httpd\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.449919 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-log-httpd\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.455142 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-public-tls-certs\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.455831 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-combined-ca-bundle\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.456061 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.463156 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-config-data\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.464900 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsxq8\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-kube-api-access-dsxq8\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.471357 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-internal-tls-certs\") pod \"swift-proxy-76fb878d5c-s22sw\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.557976 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dc49c789d-5gcj4"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.590917 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.651842 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-combined-ca-bundle\") pod \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") "
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.651941 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-ovndb-tls-certs\") pod \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") "
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.651970 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-config\") pod \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") "
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.651989 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-httpd-config\") pod \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") "
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.652075 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgbtk\" (UniqueName: \"kubernetes.io/projected/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-kube-api-access-fgbtk\") pod \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\" (UID: \"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec\") "
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.657801 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-kube-api-access-fgbtk" (OuterVolumeSpecName: "kube-api-access-fgbtk") pod "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" (UID: "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec"). InnerVolumeSpecName "kube-api-access-fgbtk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.664380 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" (UID: "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.705843 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" (UID: "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.745094 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" (UID: "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.746521 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-config" (OuterVolumeSpecName: "config") pod "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" (UID: "7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.756080 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.756109 4931 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.756118 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.756128 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.756137 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgbtk\" (UniqueName: \"kubernetes.io/projected/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec-kube-api-access-fgbtk\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.924161 4931 generic.go:334] "Generic (PLEG): container finished" podID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerID="f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a" exitCode=0
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.924355 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc49c789d-5gcj4" event={"ID":"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec","Type":"ContainerDied","Data":"f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a"}
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.925159 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc49c789d-5gcj4" event={"ID":"7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec","Type":"ContainerDied","Data":"49b94c209fcd846b366cb60120c52ee63d74a76288f62e76634d76df2ff577f1"}
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.924467 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dc49c789d-5gcj4"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.925239 4931 scope.go:117] "RemoveContainer" containerID="e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.955586 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dc49c789d-5gcj4"]
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.958745 4931 scope.go:117] "RemoveContainer" containerID="f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.963668 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dc49c789d-5gcj4"]
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.978561 4931 scope.go:117] "RemoveContainer" containerID="e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2"
Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.978892 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2\": container with ID starting with e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2 not found: ID does not exist" containerID="e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.978985 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2"} err="failed to get container status \"e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2\": rpc error: code = NotFound desc = could not find container \"e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2\": container with ID starting with e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2 not found: ID does not exist"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.979084 4931 scope.go:117] "RemoveContainer" containerID="f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a"
Jan 30 05:26:58 crc kubenswrapper[4931]: E0130 05:26:58.979516 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a\": container with ID starting with f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a not found: ID does not exist" containerID="f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a"
Jan 30 05:26:58 crc kubenswrapper[4931]: I0130 05:26:58.979547 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a"} err="failed to get container status \"f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a\": rpc error: code = NotFound desc = could not find container \"f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a\": container with ID starting with f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a not found: ID does not exist"
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.180547 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76fb878d5c-s22sw"]
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.384479 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.384890 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-central-agent" containerID="cri-o://08e9531045b4ed348a8e15f9e06ca988f965f1f605cc60da77be0b95272ee581" gracePeriod=30
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.385014 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="proxy-httpd" containerID="cri-o://9e8bac5316b66891bb65008c7a82aba4d5b92fb6001c63ba8c49a06e95040b7f" gracePeriod=30
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.385043 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="sg-core" containerID="cri-o://b2d093b072c9cdec044442a6734371a3413de8a6cea48d10abb06780f1cd4e75" gracePeriod=30
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.385181 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-notification-agent" containerID="cri-o://f4f164366b57c885cd381de9f7095a47cecfe7af0c8f7b404360ca7dbbfa150d" gracePeriod=30
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.389094 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.443502 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" path="/var/lib/kubelet/pods/7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec/volumes"
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.449949 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807d8709-a403-4186-83f5-ec76aee793fe" path="/var/lib/kubelet/pods/807d8709-a403-4186-83f5-ec76aee793fe/volumes"
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.942091 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76fb878d5c-s22sw" event={"ID":"98d21216-5a0f-422c-9642-0ea353a33e82","Type":"ContainerStarted","Data":"02a426537f79889d684c812318ea1dd0bd0af03a098fb1e7d47cd94e43353e1c"}
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.942733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76fb878d5c-s22sw" event={"ID":"98d21216-5a0f-422c-9642-0ea353a33e82","Type":"ContainerStarted","Data":"3baca7478354f0fb4066c265761c1fd4465b993347daea084eeb2d40cd40bed6"}
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.942756 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.942770 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.942782 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76fb878d5c-s22sw" event={"ID":"98d21216-5a0f-422c-9642-0ea353a33e82","Type":"ContainerStarted","Data":"68d0e2dfe8dc67ba7ff79544ecf0a950e34ec34379d61e5a1edf698fb315e6f7"}
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.948506 4931 generic.go:334] "Generic (PLEG): container finished" podID="e5755369-fc75-443e-b608-996b7212ac94" containerID="9e8bac5316b66891bb65008c7a82aba4d5b92fb6001c63ba8c49a06e95040b7f" exitCode=0
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.948533 4931 generic.go:334] "Generic (PLEG): container finished" podID="e5755369-fc75-443e-b608-996b7212ac94" containerID="b2d093b072c9cdec044442a6734371a3413de8a6cea48d10abb06780f1cd4e75" exitCode=2
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.948547 4931 generic.go:334] "Generic (PLEG): container finished" podID="e5755369-fc75-443e-b608-996b7212ac94" containerID="08e9531045b4ed348a8e15f9e06ca988f965f1f605cc60da77be0b95272ee581" exitCode=0
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.948566 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerDied","Data":"9e8bac5316b66891bb65008c7a82aba4d5b92fb6001c63ba8c49a06e95040b7f"}
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.948586 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerDied","Data":"b2d093b072c9cdec044442a6734371a3413de8a6cea48d10abb06780f1cd4e75"}
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.948601 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerDied","Data":"08e9531045b4ed348a8e15f9e06ca988f965f1f605cc60da77be0b95272ee581"}
Jan 30 05:26:59 crc kubenswrapper[4931]: I0130 05:26:59.972183 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76fb878d5c-s22sw" podStartSLOduration=1.9721654659999999 podStartE2EDuration="1.972165466s" podCreationTimestamp="2026-01-30 05:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:59.964356763 +0000 UTC m=+1155.334267030" watchObservedRunningTime="2026-01-30 05:26:59.972165466 +0000 UTC m=+1155.342075723"
Jan 30 05:27:01 crc kubenswrapper[4931]: I0130 05:27:01.351886 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 30 05:27:01 crc kubenswrapper[4931]: I0130 05:27:01.971273 4931 generic.go:334] "Generic (PLEG): container finished" podID="e5755369-fc75-443e-b608-996b7212ac94" containerID="f4f164366b57c885cd381de9f7095a47cecfe7af0c8f7b404360ca7dbbfa150d" exitCode=0
Jan 30 05:27:01 crc kubenswrapper[4931]: I0130 05:27:01.971345 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerDied","Data":"f4f164366b57c885cd381de9f7095a47cecfe7af0c8f7b404360ca7dbbfa150d"}
Jan 30 05:27:04 crc kubenswrapper[4931]: I0130 05:27:04.389031 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 05:27:04 crc kubenswrapper[4931]: I0130 05:27:04.389478 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" containerName="kube-state-metrics" containerID="cri-o://b4e72393e02c3e0619fa1e4bea6d0742ffbef4de8775e7d96ccdc8545af19acb" gracePeriod=30
Jan 30 05:27:05 crc kubenswrapper[4931]: I0130 05:27:05.001567 4931 generic.go:334] "Generic (PLEG): container finished" podID="75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" containerID="b4e72393e02c3e0619fa1e4bea6d0742ffbef4de8775e7d96ccdc8545af19acb" exitCode=2
Jan 30 05:27:05 crc kubenswrapper[4931]: I0130 05:27:05.001789 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2","Type":"ContainerDied","Data":"b4e72393e02c3e0619fa1e4bea6d0742ffbef4de8775e7d96ccdc8545af19acb"}
Jan 30 05:27:05 crc kubenswrapper[4931]: I0130 05:27:05.371140 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": dial tcp 10.217.0.106:8081: connect: connection refused"
Jan 30 05:27:05 crc kubenswrapper[4931]: I0130 05:27:05.502328 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 05:27:05 crc kubenswrapper[4931]: I0130 05:27:05.502877 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-log" containerID="cri-o://c3997c6bebfa178d2d159b7c46082a54c7bc989ec2ee2d507189f6cfa3f09d57" gracePeriod=30
Jan 30 05:27:05 crc kubenswrapper[4931]: I0130 05:27:05.502941 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-httpd" containerID="cri-o://4dc106a9347c18b30457fcfe0ba0955c89ca0037e4655a069609d973aaa2c8d2" gracePeriod=30
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.010129 4931 generic.go:334] "Generic (PLEG): container finished" podID="97f44787-3f37-44f1-85a5-4acffef71d95" containerID="c3997c6bebfa178d2d159b7c46082a54c7bc989ec2ee2d507189f6cfa3f09d57" exitCode=143
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.010172 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97f44787-3f37-44f1-85a5-4acffef71d95","Type":"ContainerDied","Data":"c3997c6bebfa178d2d159b7c46082a54c7bc989ec2ee2d507189f6cfa3f09d57"}
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.847988 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.909880 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2j42\" (UniqueName: \"kubernetes.io/projected/e5755369-fc75-443e-b608-996b7212ac94-kube-api-access-x2j42\") pod \"e5755369-fc75-443e-b608-996b7212ac94\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") "
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.909931 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-run-httpd\") pod \"e5755369-fc75-443e-b608-996b7212ac94\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") "
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.910013 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-log-httpd\") pod \"e5755369-fc75-443e-b608-996b7212ac94\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") "
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.910034 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-sg-core-conf-yaml\") pod \"e5755369-fc75-443e-b608-996b7212ac94\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") "
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.910054 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-config-data\") pod \"e5755369-fc75-443e-b608-996b7212ac94\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") "
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.910104 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-scripts\") pod \"e5755369-fc75-443e-b608-996b7212ac94\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") "
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.910142 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-combined-ca-bundle\") pod \"e5755369-fc75-443e-b608-996b7212ac94\" (UID: \"e5755369-fc75-443e-b608-996b7212ac94\") "
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.911788 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5755369-fc75-443e-b608-996b7212ac94" (UID: "e5755369-fc75-443e-b608-996b7212ac94"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.911807 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5755369-fc75-443e-b608-996b7212ac94" (UID: "e5755369-fc75-443e-b608-996b7212ac94"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.921478 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5755369-fc75-443e-b608-996b7212ac94-kube-api-access-x2j42" (OuterVolumeSpecName: "kube-api-access-x2j42") pod "e5755369-fc75-443e-b608-996b7212ac94" (UID: "e5755369-fc75-443e-b608-996b7212ac94"). InnerVolumeSpecName "kube-api-access-x2j42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.921663 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-scripts" (OuterVolumeSpecName: "scripts") pod "e5755369-fc75-443e-b608-996b7212ac94" (UID: "e5755369-fc75-443e-b608-996b7212ac94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.952725 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5755369-fc75-443e-b608-996b7212ac94" (UID: "e5755369-fc75-443e-b608-996b7212ac94"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:27:06 crc kubenswrapper[4931]: I0130 05:27:06.963459 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.009801 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5755369-fc75-443e-b608-996b7212ac94" (UID: "e5755369-fc75-443e-b608-996b7212ac94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.011832 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2j42\" (UniqueName: \"kubernetes.io/projected/e5755369-fc75-443e-b608-996b7212ac94-kube-api-access-x2j42\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.011861 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.011870 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5755369-fc75-443e-b608-996b7212ac94-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.011878 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.011886 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.011895 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.030462 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6b263e8e-7618-4044-bed1-b35174d6a8f4","Type":"ContainerStarted","Data":"998f26954c016e9a4be4fed72f68f879a1f7793c171311545d8f4958871325fd"}
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.038496 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5755369-fc75-443e-b608-996b7212ac94","Type":"ContainerDied","Data":"ea5ec3936bd62e44ba566c0aec793fe2ef89fc6023d7cad6b0242228b7b8d07a"}
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.038539 4931 scope.go:117] "RemoveContainer" containerID="9e8bac5316b66891bb65008c7a82aba4d5b92fb6001c63ba8c49a06e95040b7f"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.038664 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.042243 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2","Type":"ContainerDied","Data":"54dbd562a66dec0d4b1d17dc98e849f8bad3b54d165bc61c92a13695e75f4ae7"}
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.042297 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.049010 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-config-data" (OuterVolumeSpecName: "config-data") pod "e5755369-fc75-443e-b608-996b7212ac94" (UID: "e5755369-fc75-443e-b608-996b7212ac94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.054250 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.552493659 podStartE2EDuration="12.054226109s" podCreationTimestamp="2026-01-30 05:26:55 +0000 UTC" firstStartedPulling="2026-01-30 05:26:56.111731983 +0000 UTC m=+1151.481642240" lastFinishedPulling="2026-01-30 05:27:06.613464433 +0000 UTC m=+1161.983374690" observedRunningTime="2026-01-30 05:27:07.049296715 +0000 UTC m=+1162.419206972" watchObservedRunningTime="2026-01-30 05:27:07.054226109 +0000 UTC m=+1162.424136366"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.066355 4931 scope.go:117] "RemoveContainer" containerID="b2d093b072c9cdec044442a6734371a3413de8a6cea48d10abb06780f1cd4e75"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.093090 4931 scope.go:117] "RemoveContainer" containerID="f4f164366b57c885cd381de9f7095a47cecfe7af0c8f7b404360ca7dbbfa150d"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.114173 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqss9\" (UniqueName: \"kubernetes.io/projected/75e7b62f-8246-48b8-bcbb-d7c5129dd5e2-kube-api-access-tqss9\") pod \"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2\" (UID: \"75e7b62f-8246-48b8-bcbb-d7c5129dd5e2\") "
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.115339 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5755369-fc75-443e-b608-996b7212ac94-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.115696 4931 scope.go:117] "RemoveContainer" containerID="08e9531045b4ed348a8e15f9e06ca988f965f1f605cc60da77be0b95272ee581"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.118794 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e7b62f-8246-48b8-bcbb-d7c5129dd5e2-kube-api-access-tqss9" (OuterVolumeSpecName: "kube-api-access-tqss9") pod "75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" (UID: "75e7b62f-8246-48b8-bcbb-d7c5129dd5e2"). InnerVolumeSpecName "kube-api-access-tqss9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.135690 4931 scope.go:117] "RemoveContainer" containerID="b4e72393e02c3e0619fa1e4bea6d0742ffbef4de8775e7d96ccdc8545af19acb"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.216647 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqss9\" (UniqueName: \"kubernetes.io/projected/75e7b62f-8246-48b8-bcbb-d7c5129dd5e2-kube-api-access-tqss9\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.376006 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.385907 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.396769 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.404400 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.414863 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 05:27:07.415235 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="sg-core"
Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415251 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="sg-core"
Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 05:27:07.415265 4931 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-central-agent" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415271 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-central-agent" Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 05:27:07.415280 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="proxy-httpd" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415287 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="proxy-httpd" Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 05:27:07.415293 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-notification-agent" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415301 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-notification-agent" Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 05:27:07.415316 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-api" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415321 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-api" Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 05:27:07.415333 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" containerName="kube-state-metrics" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415339 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" containerName="kube-state-metrics" Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 
05:27:07.415358 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-httpd" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415363 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-httpd" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415588 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="proxy-httpd" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415602 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-central-agent" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415612 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" containerName="kube-state-metrics" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415625 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="ceilometer-notification-agent" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415634 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5755369-fc75-443e-b608-996b7212ac94" containerName="sg-core" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415644 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-httpd" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.415654 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c600b2f-8fcd-402b-bd79-9d64f8d1f1ec" containerName="neutron-api" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.417135 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.419286 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-b6t4t" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.419531 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.419885 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.423085 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.432302 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e7b62f-8246-48b8-bcbb-d7c5129dd5e2" path="/var/lib/kubelet/pods/75e7b62f-8246-48b8-bcbb-d7c5129dd5e2/volumes" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.434445 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5755369-fc75-443e-b608-996b7212ac94" path="/var/lib/kubelet/pods/e5755369-fc75-443e-b608-996b7212ac94/volumes" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.435818 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.437591 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.438147 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.443814 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.443827 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.448120 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521667 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-log-httpd\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521732 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-scripts\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521768 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521797 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521828 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-config-data\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521854 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-run-httpd\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521943 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znwrn\" (UniqueName: 
\"kubernetes.io/projected/30f9b591-fea6-4010-99db-45eef2237cdc-kube-api-access-znwrn\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.521979 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.522034 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.522076 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knwgb\" (UniqueName: \"kubernetes.io/projected/e42548a3-5a7b-4f5b-8b13-8b5746710618-kube-api-access-knwgb\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.623856 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knwgb\" (UniqueName: \"kubernetes.io/projected/e42548a3-5a7b-4f5b-8b13-8b5746710618-kube-api-access-knwgb\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.623978 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-log-httpd\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624039 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-scripts\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624082 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624123 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624163 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-config-data\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624202 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc 
kubenswrapper[4931]: I0130 05:27:07.624261 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624303 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-run-httpd\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624346 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znwrn\" (UniqueName: \"kubernetes.io/projected/30f9b591-fea6-4010-99db-45eef2237cdc-kube-api-access-znwrn\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624403 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.624523 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.625554 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-log-httpd\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.625648 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-run-httpd\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.631941 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:07 crc kubenswrapper[4931]: E0130 05:27:07.632708 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-knwgb scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="e42548a3-5a7b-4f5b-8b13-8b5746710618" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.634235 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.635589 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.636498 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.636619 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.640443 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.640648 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-scripts\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.641876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.648267 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knwgb\" (UniqueName: \"kubernetes.io/projected/e42548a3-5a7b-4f5b-8b13-8b5746710618-kube-api-access-knwgb\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " 
pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.648627 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-config-data\") pod \"ceilometer-0\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.656239 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znwrn\" (UniqueName: \"kubernetes.io/projected/30f9b591-fea6-4010-99db-45eef2237cdc-kube-api-access-znwrn\") pod \"kube-state-metrics-0\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " pod="openstack/kube-state-metrics-0" Jan 30 05:27:07 crc kubenswrapper[4931]: I0130 05:27:07.754428 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.057780 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.070195 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133198 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-scripts\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133331 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-sg-core-conf-yaml\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133404 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-config-data\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133500 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-combined-ca-bundle\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133546 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-log-httpd\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-ceilometer-tls-certs\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133702 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knwgb\" (UniqueName: \"kubernetes.io/projected/e42548a3-5a7b-4f5b-8b13-8b5746710618-kube-api-access-knwgb\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.133744 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-run-httpd\") pod \"e42548a3-5a7b-4f5b-8b13-8b5746710618\" (UID: \"e42548a3-5a7b-4f5b-8b13-8b5746710618\") " Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.134377 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.136544 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.139726 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-config-data" (OuterVolumeSpecName: "config-data") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.141264 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.142323 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.142804 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42548a3-5a7b-4f5b-8b13-8b5746710618-kube-api-access-knwgb" (OuterVolumeSpecName: "kube-api-access-knwgb") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "kube-api-access-knwgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.144933 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-scripts" (OuterVolumeSpecName: "scripts") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.146555 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e42548a3-5a7b-4f5b-8b13-8b5746710618" (UID: "e42548a3-5a7b-4f5b-8b13-8b5746710618"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.236352 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.236377 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.236387 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.236395 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc 
kubenswrapper[4931]: I0130 05:27:08.236404 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.236413 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42548a3-5a7b-4f5b-8b13-8b5746710618-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.236432 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knwgb\" (UniqueName: \"kubernetes.io/projected/e42548a3-5a7b-4f5b-8b13-8b5746710618-kube-api-access-knwgb\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.236439 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42548a3-5a7b-4f5b-8b13-8b5746710618-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.260459 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.287647 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vbzqc"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.288630 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.301106 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vbzqc"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.374145 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xvdtt"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.375095 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.390666 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0120-account-create-update-dptmf"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.398980 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.401960 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.408675 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xvdtt"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.424568 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0120-account-create-update-dptmf"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.440375 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5cbb37a-882a-46cf-9cee-0543ac708004-operator-scripts\") pod \"nova-cell0-db-create-xvdtt\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.440597 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5hz\" (UniqueName: \"kubernetes.io/projected/d5cbb37a-882a-46cf-9cee-0543ac708004-kube-api-access-kt5hz\") pod \"nova-cell0-db-create-xvdtt\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.440667 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-operator-scripts\") pod \"nova-api-db-create-vbzqc\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.440734 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlfnc\" (UniqueName: \"kubernetes.io/projected/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-kube-api-access-jlfnc\") pod \"nova-api-db-create-vbzqc\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.485196 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-x4mqp"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.486457 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.491342 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-x4mqp"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.541867 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5cbb37a-882a-46cf-9cee-0543ac708004-operator-scripts\") pod \"nova-cell0-db-create-xvdtt\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542155 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8624816-8c2c-4d9c-b3a5-426253850926-operator-scripts\") pod \"nova-cell1-db-create-x4mqp\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542188 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5hz\" (UniqueName: \"kubernetes.io/projected/d5cbb37a-882a-46cf-9cee-0543ac708004-kube-api-access-kt5hz\") pod \"nova-cell0-db-create-xvdtt\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542216 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-operator-scripts\") pod \"nova-api-db-create-vbzqc\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542246 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2bbp\" (UniqueName: \"kubernetes.io/projected/e8624816-8c2c-4d9c-b3a5-426253850926-kube-api-access-b2bbp\") pod \"nova-cell1-db-create-x4mqp\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542285 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlfnc\" (UniqueName: \"kubernetes.io/projected/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-kube-api-access-jlfnc\") pod \"nova-api-db-create-vbzqc\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542337 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkdss\" (UniqueName: \"kubernetes.io/projected/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-kube-api-access-mkdss\") pod \"nova-api-0120-account-create-update-dptmf\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc 
kubenswrapper[4931]: I0130 05:27:08.542368 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-operator-scripts\") pod \"nova-api-0120-account-create-update-dptmf\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542624 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5cbb37a-882a-46cf-9cee-0543ac708004-operator-scripts\") pod \"nova-cell0-db-create-xvdtt\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.542865 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-operator-scripts\") pod \"nova-api-db-create-vbzqc\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.566165 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5hz\" (UniqueName: \"kubernetes.io/projected/d5cbb37a-882a-46cf-9cee-0543ac708004-kube-api-access-kt5hz\") pod \"nova-cell0-db-create-xvdtt\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.570106 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlfnc\" (UniqueName: \"kubernetes.io/projected/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-kube-api-access-jlfnc\") pod \"nova-api-db-create-vbzqc\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: 
I0130 05:27:08.592201 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-vfdzl"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.593571 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.595287 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.613374 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.614928 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-vfdzl"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.619654 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76fb878d5c-s22sw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.622971 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76fb878d5c-s22sw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.649145 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8624816-8c2c-4d9c-b3a5-426253850926-operator-scripts\") pod \"nova-cell1-db-create-x4mqp\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.649362 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2bbp\" (UniqueName: \"kubernetes.io/projected/e8624816-8c2c-4d9c-b3a5-426253850926-kube-api-access-b2bbp\") pod \"nova-cell1-db-create-x4mqp\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " 
pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.649549 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/053ccacf-d473-49f5-89e5-545a753e5e03-operator-scripts\") pod \"nova-cell0-10f6-account-create-update-vfdzl\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.649654 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdss\" (UniqueName: \"kubernetes.io/projected/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-kube-api-access-mkdss\") pod \"nova-api-0120-account-create-update-dptmf\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.649787 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-operator-scripts\") pod \"nova-api-0120-account-create-update-dptmf\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.650042 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89msj\" (UniqueName: \"kubernetes.io/projected/053ccacf-d473-49f5-89e5-545a753e5e03-kube-api-access-89msj\") pod \"nova-cell0-10f6-account-create-update-vfdzl\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.651645 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e8624816-8c2c-4d9c-b3a5-426253850926-operator-scripts\") pod \"nova-cell1-db-create-x4mqp\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.654565 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-operator-scripts\") pod \"nova-api-0120-account-create-update-dptmf\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.685685 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2bbp\" (UniqueName: \"kubernetes.io/projected/e8624816-8c2c-4d9c-b3a5-426253850926-kube-api-access-b2bbp\") pod \"nova-cell1-db-create-x4mqp\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.687255 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdss\" (UniqueName: \"kubernetes.io/projected/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-kube-api-access-mkdss\") pod \"nova-api-0120-account-create-update-dptmf\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.704442 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.721065 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.751355 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/053ccacf-d473-49f5-89e5-545a753e5e03-operator-scripts\") pod \"nova-cell0-10f6-account-create-update-vfdzl\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.751477 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89msj\" (UniqueName: \"kubernetes.io/projected/053ccacf-d473-49f5-89e5-545a753e5e03-kube-api-access-89msj\") pod \"nova-cell0-10f6-account-create-update-vfdzl\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.752789 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/053ccacf-d473-49f5-89e5-545a753e5e03-operator-scripts\") pod \"nova-cell0-10f6-account-create-update-vfdzl\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.806720 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89msj\" (UniqueName: \"kubernetes.io/projected/053ccacf-d473-49f5-89e5-545a753e5e03-kube-api-access-89msj\") pod \"nova-cell0-10f6-account-create-update-vfdzl\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.814230 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-326d-account-create-update-rvcsw"] Jan 30 05:27:08 crc kubenswrapper[4931]: 
I0130 05:27:08.815578 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.818083 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.821586 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-rvcsw"] Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.850286 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.861197 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff3c7ac-e403-4826-bf45-a6bed05570b7-operator-scripts\") pod \"nova-cell1-326d-account-create-update-rvcsw\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.861379 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.861391 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz255\" (UniqueName: \"kubernetes.io/projected/1ff3c7ac-e403-4826-bf45-a6bed05570b7-kube-api-access-xz255\") pod \"nova-cell1-326d-account-create-update-rvcsw\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.962819 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff3c7ac-e403-4826-bf45-a6bed05570b7-operator-scripts\") pod \"nova-cell1-326d-account-create-update-rvcsw\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.962935 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz255\" (UniqueName: \"kubernetes.io/projected/1ff3c7ac-e403-4826-bf45-a6bed05570b7-kube-api-access-xz255\") pod \"nova-cell1-326d-account-create-update-rvcsw\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.964771 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff3c7ac-e403-4826-bf45-a6bed05570b7-operator-scripts\") pod \"nova-cell1-326d-account-create-update-rvcsw\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:08 crc kubenswrapper[4931]: I0130 05:27:08.980960 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz255\" (UniqueName: 
\"kubernetes.io/projected/1ff3c7ac-e403-4826-bf45-a6bed05570b7-kube-api-access-xz255\") pod \"nova-cell1-326d-account-create-update-rvcsw\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:09 crc kubenswrapper[4931]: E0130 05:27:09.003843 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92991ff_5b79_452a_b5ac_9dc90ab42f68.slice/crio-conmon-fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c600b2f_8fcd_402b_bd79_9d64f8d1f1ec.slice/crio-49b94c209fcd846b366cb60120c52ee63d74a76288f62e76634d76df2ff577f1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c600b2f_8fcd_402b_bd79_9d64f8d1f1ec.slice/crio-e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92991ff_5b79_452a_b5ac_9dc90ab42f68.slice/crio-conmon-133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84203bc9_afb4_42cb_843d_c211490ce275.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice/crio-conmon-08e9531045b4ed348a8e15f9e06ca988f965f1f605cc60da77be0b95272ee581.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c600b2f_8fcd_402b_bd79_9d64f8d1f1ec.slice/crio-conmon-e15def09885e74ede7250dabc482871e820b4af404030d975622af536ceb70c2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c600b2f_8fcd_402b_bd79_9d64f8d1f1ec.slice/crio-f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice/crio-conmon-9e8bac5316b66891bb65008c7a82aba4d5b92fb6001c63ba8c49a06e95040b7f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97f44787_3f37_44f1_85a5_4acffef71d95.slice/crio-4dc106a9347c18b30457fcfe0ba0955c89ca0037e4655a069609d973aaa2c8d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c600b2f_8fcd_402b_bd79_9d64f8d1f1ec.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92991ff_5b79_452a_b5ac_9dc90ab42f68.slice/crio-fd03896a043ef7af736cea0542c13e92c9e0f2ad27eb023f4ee70e94e5161f82.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice/crio-08e9531045b4ed348a8e15f9e06ca988f965f1f605cc60da77be0b95272ee581.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84203bc9_afb4_42cb_843d_c211490ce275.slice/crio-ae04b2bb5e94306ea64985be9c8f6deb0c75c6536be8008e597d586d1aee985e\": RecentStats: unable to 
find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf6bc53b_31f7_4650_aab3_d4bcf8b685ab.slice/crio-conmon-6b99333c2447cf347ded741a740d78480c65d8cae2c155c1e36dafd5c5578db3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189be3dc_d439_47c2_b1f2_7413fc4b5e85.slice/crio-conmon-a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807d8709_a403_4186_83f5_ec76aee793fe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c600b2f_8fcd_402b_bd79_9d64f8d1f1ec.slice/crio-conmon-f166ba6e75d463bf521caced327fdcd398e366003e50939372ecbc0798e6412a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf6bc53b_31f7_4650_aab3_d4bcf8b685ab.slice/crio-6b99333c2447cf347ded741a740d78480c65d8cae2c155c1e36dafd5c5578db3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e7b62f_8246_48b8_bcbb_d7c5129dd5e2.slice/crio-54dbd562a66dec0d4b1d17dc98e849f8bad3b54d165bc61c92a13695e75f4ae7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807d8709_a403_4186_83f5_ec76aee793fe.slice/crio-conmon-4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189be3dc_d439_47c2_b1f2_7413fc4b5e85.slice/crio-a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807d8709_a403_4186_83f5_ec76aee793fe.slice/crio-fc0c653d3e574db62881709b302919c961837f9a8fc28421f26c150c1cbda477\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice/crio-conmon-f4f164366b57c885cd381de9f7095a47cecfe7af0c8f7b404360ca7dbbfa150d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84203bc9_afb4_42cb_843d_c211490ce275.slice/crio-56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97f44787_3f37_44f1_85a5_4acffef71d95.slice/crio-c3997c6bebfa178d2d159b7c46082a54c7bc989ec2ee2d507189f6cfa3f09d57.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e7b62f_8246_48b8_bcbb_d7c5129dd5e2.slice/crio-b4e72393e02c3e0619fa1e4bea6d0742ffbef4de8775e7d96ccdc8545af19acb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84203bc9_afb4_42cb_843d_c211490ce275.slice/crio-conmon-56d86156b326eeb21177e44e6d091888d2774d4f2e2a4d09d4994d48a8cc370e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92991ff_5b79_452a_b5ac_9dc90ab42f68.slice/crio-040b81795acd0bef7c76b7a99d650deaac66b5fa82f97baf669121be56928797\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice/crio-9e8bac5316b66891bb65008c7a82aba4d5b92fb6001c63ba8c49a06e95040b7f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92991ff_5b79_452a_b5ac_9dc90ab42f68.slice/crio-133ba33140e89ee163bac35cf527e591a8f3646d1451e448d0c71bf3a1966b21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807d8709_a403_4186_83f5_ec76aee793fe.slice/crio-4784748276da4d9a133003266f7562cd472e3126956926bf5055ab897b3f9fd0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e7b62f_8246_48b8_bcbb_d7c5129dd5e2.slice/crio-conmon-b4e72393e02c3e0619fa1e4bea6d0742ffbef4de8775e7d96ccdc8545af19acb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92991ff_5b79_452a_b5ac_9dc90ab42f68.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5755369_fc75_443e_b608_996b7212ac94.slice/crio-f4f164366b57c885cd381de9f7095a47cecfe7af0c8f7b404360ca7dbbfa150d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97f44787_3f37_44f1_85a5_4acffef71d95.slice/crio-conmon-c3997c6bebfa178d2d159b7c46082a54c7bc989ec2ee2d507189f6cfa3f09d57.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.080578 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30f9b591-fea6-4010-99db-45eef2237cdc","Type":"ContainerStarted","Data":"3ab5021fa2dee4a0cbf054b6b79552974b77b39e6c35cbc24e07bc801848b48b"} Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.086262 4931 generic.go:334] "Generic (PLEG): container finished" podID="97f44787-3f37-44f1-85a5-4acffef71d95" containerID="4dc106a9347c18b30457fcfe0ba0955c89ca0037e4655a069609d973aaa2c8d2" exitCode=0 Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 
05:27:09.086316 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97f44787-3f37-44f1-85a5-4acffef71d95","Type":"ContainerDied","Data":"4dc106a9347c18b30457fcfe0ba0955c89ca0037e4655a069609d973aaa2c8d2"} Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.087540 4931 generic.go:334] "Generic (PLEG): container finished" podID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerID="6b99333c2447cf347ded741a740d78480c65d8cae2c155c1e36dafd5c5578db3" exitCode=137 Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.087684 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"af6bc53b-31f7-4650-aab3-d4bcf8b685ab","Type":"ContainerDied","Data":"6b99333c2447cf347ded741a740d78480c65d8cae2c155c1e36dafd5c5578db3"} Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.087841 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.135235 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.159472 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.185318 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.188547 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.190640 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.193205 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.193508 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.193625 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.219855 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.273808 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-scripts\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.273857 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.273893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.273917 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.273952 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-config-data\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.273967 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-log-httpd\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.274023 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ww7\" (UniqueName: \"kubernetes.io/projected/177d0201-cde1-4aa2-8bcd-63ebade72464-kube-api-access-q8ww7\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.274052 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-run-httpd\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387408 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-scripts\") pod \"ceilometer-0\" (UID: 
\"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387474 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387511 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387537 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387577 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-config-data\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-log-httpd\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387646 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q8ww7\" (UniqueName: \"kubernetes.io/projected/177d0201-cde1-4aa2-8bcd-63ebade72464-kube-api-access-q8ww7\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.387676 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-run-httpd\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.388141 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-run-httpd\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.392023 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-log-httpd\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.412072 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.416594 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.421051 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.424489 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-scripts\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.426512 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-config-data\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.444195 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.488803 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data-custom\") pod \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.488868 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data\") pod \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.488979 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m9bv\" (UniqueName: \"kubernetes.io/projected/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-kube-api-access-5m9bv\") pod \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.489052 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-combined-ca-bundle\") pod \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.489096 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-logs\") pod \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.489126 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-scripts\") pod \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.489210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-etc-machine-id\") pod \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\" (UID: \"af6bc53b-31f7-4650-aab3-d4bcf8b685ab\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.496043 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42548a3-5a7b-4f5b-8b13-8b5746710618" path="/var/lib/kubelet/pods/e42548a3-5a7b-4f5b-8b13-8b5746710618/volumes" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.508132 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "af6bc53b-31f7-4650-aab3-d4bcf8b685ab" (UID: "af6bc53b-31f7-4650-aab3-d4bcf8b685ab"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.508628 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-logs" (OuterVolumeSpecName: "logs") pod "af6bc53b-31f7-4650-aab3-d4bcf8b685ab" (UID: "af6bc53b-31f7-4650-aab3-d4bcf8b685ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.516652 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af6bc53b-31f7-4650-aab3-d4bcf8b685ab" (UID: "af6bc53b-31f7-4650-aab3-d4bcf8b685ab"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.516886 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-scripts" (OuterVolumeSpecName: "scripts") pod "af6bc53b-31f7-4650-aab3-d4bcf8b685ab" (UID: "af6bc53b-31f7-4650-aab3-d4bcf8b685ab"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.534454 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-kube-api-access-5m9bv" (OuterVolumeSpecName: "kube-api-access-5m9bv") pod "af6bc53b-31f7-4650-aab3-d4bcf8b685ab" (UID: "af6bc53b-31f7-4650-aab3-d4bcf8b685ab"). InnerVolumeSpecName "kube-api-access-5m9bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.564547 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ww7\" (UniqueName: \"kubernetes.io/projected/177d0201-cde1-4aa2-8bcd-63ebade72464-kube-api-access-q8ww7\") pod \"ceilometer-0\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.572416 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af6bc53b-31f7-4650-aab3-d4bcf8b685ab" (UID: "af6bc53b-31f7-4650-aab3-d4bcf8b685ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.593564 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.593599 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m9bv\" (UniqueName: \"kubernetes.io/projected/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-kube-api-access-5m9bv\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.593611 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.593622 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.593731 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.593741 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.652374 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data" (OuterVolumeSpecName: "config-data") pod "af6bc53b-31f7-4650-aab3-d4bcf8b685ab" (UID: "af6bc53b-31f7-4650-aab3-d4bcf8b685ab"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.695332 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6bc53b-31f7-4650-aab3-d4bcf8b685ab-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.723509 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vbzqc"] Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.754859 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797174 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn59m\" (UniqueName: \"kubernetes.io/projected/97f44787-3f37-44f1-85a5-4acffef71d95-kube-api-access-vn59m\") pod \"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797477 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797542 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-combined-ca-bundle\") pod \"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797638 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-scripts\") pod 
\"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797691 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-httpd-run\") pod \"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797715 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-internal-tls-certs\") pod \"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797745 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-config-data\") pod \"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.797797 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-logs\") pod \"97f44787-3f37-44f1-85a5-4acffef71d95\" (UID: \"97f44787-3f37-44f1-85a5-4acffef71d95\") " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.798026 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.798480 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.798489 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-logs" (OuterVolumeSpecName: "logs") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.805155 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.806508 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-scripts" (OuterVolumeSpecName: "scripts") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.811218 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f44787-3f37-44f1-85a5-4acffef71d95-kube-api-access-vn59m" (OuterVolumeSpecName: "kube-api-access-vn59m") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "kube-api-access-vn59m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.817951 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.900998 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97f44787-3f37-44f1-85a5-4acffef71d95-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.901026 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn59m\" (UniqueName: \"kubernetes.io/projected/97f44787-3f37-44f1-85a5-4acffef71d95-kube-api-access-vn59m\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.901069 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.901082 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.910271 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-vfdzl"] Jan 30 05:27:09 crc kubenswrapper[4931]: W0130 05:27:09.925927 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod053ccacf_d473_49f5_89e5_545a753e5e03.slice/crio-b9fcfefa3cc95bbe91be11ad77ac98a7d6d76da884fe714257e2e61bdeb26830 WatchSource:0}: Error finding container b9fcfefa3cc95bbe91be11ad77ac98a7d6d76da884fe714257e2e61bdeb26830: Status 404 returned error can't find the container with id b9fcfefa3cc95bbe91be11ad77ac98a7d6d76da884fe714257e2e61bdeb26830 Jan 30 05:27:09 crc 
kubenswrapper[4931]: I0130 05:27:09.930671 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0120-account-create-update-dptmf"] Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.933117 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.947994 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xvdtt"] Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.952745 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-config-data" (OuterVolumeSpecName: "config-data") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.960665 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.962636 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "97f44787-3f37-44f1-85a5-4acffef71d95" (UID: "97f44787-3f37-44f1-85a5-4acffef71d95"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:09 crc kubenswrapper[4931]: I0130 05:27:09.978862 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-x4mqp"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.008971 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.009015 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.009025 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.009036 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f44787-3f37-44f1-85a5-4acffef71d95-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.049991 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-rvcsw"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.105255 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" event={"ID":"053ccacf-d473-49f5-89e5-545a753e5e03","Type":"ContainerStarted","Data":"b9fcfefa3cc95bbe91be11ad77ac98a7d6d76da884fe714257e2e61bdeb26830"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.114258 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vbzqc" 
event={"ID":"6bdb7d70-31a9-4d52-aae0-072e8c62a23f","Type":"ContainerStarted","Data":"ed1045d9c4b634bdebcb19b30994b9f7ac39021883a8b98a833d09018502f440"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.114300 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vbzqc" event={"ID":"6bdb7d70-31a9-4d52-aae0-072e8c62a23f","Type":"ContainerStarted","Data":"39e1e0fee7385124dd916707bda2070cfe0dfef223110231a2b4c91e3fb436e2"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.123952 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97f44787-3f37-44f1-85a5-4acffef71d95","Type":"ContainerDied","Data":"7bfff4eea4487971b7e050b186c84e3209413100130292fb4b6aba07f7e36bce"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.123998 4931 scope.go:117] "RemoveContainer" containerID="4dc106a9347c18b30457fcfe0ba0955c89ca0037e4655a069609d973aaa2c8d2" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.124118 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.137979 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xvdtt" event={"ID":"d5cbb37a-882a-46cf-9cee-0543ac708004","Type":"ContainerStarted","Data":"a99db27aa7441abc06efd7c51c328c92f8d197643334e3cadd83bbe4996d30bd"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.143971 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-vbzqc" podStartSLOduration=2.143951128 podStartE2EDuration="2.143951128s" podCreationTimestamp="2026-01-30 05:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:10.134055691 +0000 UTC m=+1165.503965948" watchObservedRunningTime="2026-01-30 05:27:10.143951128 +0000 UTC m=+1165.513861385" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.145206 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0120-account-create-update-dptmf" event={"ID":"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1","Type":"ContainerStarted","Data":"2faa6341fc48e403680c1a43938fac76f4ec0ce1bf1abf1d909499ba638bb1a3"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.148447 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x4mqp" event={"ID":"e8624816-8c2c-4d9c-b3a5-426253850926","Type":"ContainerStarted","Data":"6bf02e4c1dbd5fde8736e51c53a7aee2fa38184d7d077a5959c84e1d4d9b84d2"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.153328 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.153430 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"af6bc53b-31f7-4650-aab3-d4bcf8b685ab","Type":"ContainerDied","Data":"9ea7e55f3db83940154f2b0bcb0d4ef000b7a1cdefa0062385e86b6b76cab2c2"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.155515 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30f9b591-fea6-4010-99db-45eef2237cdc","Type":"ContainerStarted","Data":"100081f00d3d095ca7d8dca6b7343ac8590f3de539067c314527dbcd86ceca1c"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.155722 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.160098 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-326d-account-create-update-rvcsw" event={"ID":"1ff3c7ac-e403-4826-bf45-a6bed05570b7","Type":"ContainerStarted","Data":"4719e62df3f7e4e05282f94f65b6783badfcadb12c381754281d462174b59154"} Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.177880 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.791152885 podStartE2EDuration="3.177844587s" podCreationTimestamp="2026-01-30 05:27:07 +0000 UTC" firstStartedPulling="2026-01-30 05:27:08.265186768 +0000 UTC m=+1163.635097045" lastFinishedPulling="2026-01-30 05:27:08.65187849 +0000 UTC m=+1164.021788747" observedRunningTime="2026-01-30 05:27:10.170041485 +0000 UTC m=+1165.539951742" watchObservedRunningTime="2026-01-30 05:27:10.177844587 +0000 UTC m=+1165.547754854" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.186352 4931 scope.go:117] "RemoveContainer" containerID="c3997c6bebfa178d2d159b7c46082a54c7bc989ec2ee2d507189f6cfa3f09d57" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 
05:27:10.239499 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.243695 4931 scope.go:117] "RemoveContainer" containerID="6b99333c2447cf347ded741a740d78480c65d8cae2c155c1e36dafd5c5578db3" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.249954 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288248 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: E0130 05:27:10.288692 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-log" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288708 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-log" Jan 30 05:27:10 crc kubenswrapper[4931]: E0130 05:27:10.288718 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-httpd" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288726 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-httpd" Jan 30 05:27:10 crc kubenswrapper[4931]: E0130 05:27:10.288753 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api-log" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288760 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api-log" Jan 30 05:27:10 crc kubenswrapper[4931]: E0130 05:27:10.288779 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api" 
Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288786 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288971 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api-log" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288982 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-log" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.288993 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" containerName="glance-httpd" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.289017 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" containerName="cinder-api" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.289925 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.293184 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.293354 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.312696 4931 scope.go:117] "RemoveContainer" containerID="2a8b28eadcd454ca8adf3b36ea9153ce1d0f727ce7e3b65bd14a0471ebbea32f" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.323721 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.372311 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.382055 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.410276 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.411961 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.416407 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.418140 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.418305 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.418901 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.418937 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnhps\" (UniqueName: \"kubernetes.io/projected/3415cfc4-a71a-4110-bf82-295181bb386f-kube-api-access-nnhps\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.419014 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.419035 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.419106 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.419125 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.419159 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.419211 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.419842 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: 
I0130 05:27:10.444489 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.521451 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-logs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.521844 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.521877 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.521900 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.521989 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64zlj\" (UniqueName: \"kubernetes.io/projected/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-kube-api-access-64zlj\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 
crc kubenswrapper[4931]: I0130 05:27:10.522007 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522029 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnhps\" (UniqueName: \"kubernetes.io/projected/3415cfc4-a71a-4110-bf82-295181bb386f-kube-api-access-nnhps\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522057 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522078 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522102 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522147 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522164 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522190 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522209 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522230 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data-custom\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522249 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522268 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-scripts\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522380 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.522615 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.523852 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.528004 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.528311 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.528673 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.532823 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.543993 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnhps\" (UniqueName: \"kubernetes.io/projected/3415cfc4-a71a-4110-bf82-295181bb386f-kube-api-access-nnhps\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.572786 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") " pod="openstack/glance-default-internal-api-0" Jan 30 
05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.613665 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.624391 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-scripts\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.624458 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-logs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.624500 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.624525 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.624557 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64zlj\" (UniqueName: \"kubernetes.io/projected/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-kube-api-access-64zlj\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 
05:27:10.624983 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-logs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.625122 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.625201 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.625220 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.625244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data-custom\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.626077 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.627723 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-scripts\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.629991 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data-custom\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.630992 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.631017 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.633532 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.633583 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.642087 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64zlj\" (UniqueName: \"kubernetes.io/projected/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-kube-api-access-64zlj\") pod \"cinder-api-0\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") " pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.745161 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:27:10 crc kubenswrapper[4931]: I0130 05:27:10.971334 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.170410 4931 generic.go:334] "Generic (PLEG): container finished" podID="1ff3c7ac-e403-4826-bf45-a6bed05570b7" containerID="daeb4e60a2f2e8b0ecc5573dd48689c8e466dc66250fe49e905723d105d79613" exitCode=0 Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.170556 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-326d-account-create-update-rvcsw" event={"ID":"1ff3c7ac-e403-4826-bf45-a6bed05570b7","Type":"ContainerDied","Data":"daeb4e60a2f2e8b0ecc5573dd48689c8e466dc66250fe49e905723d105d79613"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.172804 4931 generic.go:334] "Generic (PLEG): container finished" podID="7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1" containerID="6c4ebb40e4402e95e337ac0e8eea0a4fb903b22dbcfc5ac614853d0c17f24e3a" exitCode=0 Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.172858 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0120-account-create-update-dptmf" 
event={"ID":"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1","Type":"ContainerDied","Data":"6c4ebb40e4402e95e337ac0e8eea0a4fb903b22dbcfc5ac614853d0c17f24e3a"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.174157 4931 generic.go:334] "Generic (PLEG): container finished" podID="6bdb7d70-31a9-4d52-aae0-072e8c62a23f" containerID="ed1045d9c4b634bdebcb19b30994b9f7ac39021883a8b98a833d09018502f440" exitCode=0 Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.174198 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vbzqc" event={"ID":"6bdb7d70-31a9-4d52-aae0-072e8c62a23f","Type":"ContainerDied","Data":"ed1045d9c4b634bdebcb19b30994b9f7ac39021883a8b98a833d09018502f440"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.176389 4931 generic.go:334] "Generic (PLEG): container finished" podID="d5cbb37a-882a-46cf-9cee-0543ac708004" containerID="edf9b3d1d8428caf5db14c3063b00d649e4d886f974003048a406d3bcf0b7c43" exitCode=0 Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.176439 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xvdtt" event={"ID":"d5cbb37a-882a-46cf-9cee-0543ac708004","Type":"ContainerDied","Data":"edf9b3d1d8428caf5db14c3063b00d649e4d886f974003048a406d3bcf0b7c43"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.184291 4931 generic.go:334] "Generic (PLEG): container finished" podID="e8624816-8c2c-4d9c-b3a5-426253850926" containerID="5712d27fd9c195ed4c35f4530c38c5e87c6a63708aedb0fa792d34d9e26a0b9a" exitCode=0 Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.184350 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x4mqp" event={"ID":"e8624816-8c2c-4d9c-b3a5-426253850926","Type":"ContainerDied","Data":"5712d27fd9c195ed4c35f4530c38c5e87c6a63708aedb0fa792d34d9e26a0b9a"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.191193 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="053ccacf-d473-49f5-89e5-545a753e5e03" containerID="976d06480a8d07dd149684c2767dbf90e61f0fd7efbc4d623ba32e7d83fb861e" exitCode=0 Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.191244 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" event={"ID":"053ccacf-d473-49f5-89e5-545a753e5e03","Type":"ContainerDied","Data":"976d06480a8d07dd149684c2767dbf90e61f0fd7efbc4d623ba32e7d83fb861e"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.192544 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.197039 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerStarted","Data":"859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.197065 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerStarted","Data":"dff92e478717103c570a46fcba5be2f0a5365832852a9809f2694fb9464d3ab5"} Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.217065 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:27:11 crc kubenswrapper[4931]: W0130 05:27:11.218840 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc86b96a8_cd5c_4ea7_8a6f_5b3a4b2d923e.slice/crio-d58c6ad814535983bdd3740a7cd3d8c344b8ddf68658a3e7d51e045ec46e07d7 WatchSource:0}: Error finding container d58c6ad814535983bdd3740a7cd3d8c344b8ddf68658a3e7d51e045ec46e07d7: Status 404 returned error can't find the container with id d58c6ad814535983bdd3740a7cd3d8c344b8ddf68658a3e7d51e045ec46e07d7 Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.434677 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f44787-3f37-44f1-85a5-4acffef71d95" path="/var/lib/kubelet/pods/97f44787-3f37-44f1-85a5-4acffef71d95/volumes" Jan 30 05:27:11 crc kubenswrapper[4931]: I0130 05:27:11.435403 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af6bc53b-31f7-4650-aab3-d4bcf8b685ab" path="/var/lib/kubelet/pods/af6bc53b-31f7-4650-aab3-d4bcf8b685ab/volumes" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.235313 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerStarted","Data":"a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf"} Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.238965 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3415cfc4-a71a-4110-bf82-295181bb386f","Type":"ContainerStarted","Data":"3795752efe01b170e153d47107ec186f09220104cdec5c71d0a210a63580f858"} Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.239196 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3415cfc4-a71a-4110-bf82-295181bb386f","Type":"ContainerStarted","Data":"8686488d53f891915ba13840ec460659816d6140e0778cc81ec5034b3206cf0a"} Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.241488 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e","Type":"ContainerStarted","Data":"c948d726013eb4e8273ef998118172023cae6536ca99db77a7f1ebd4884def12"} Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.241545 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e","Type":"ContainerStarted","Data":"d58c6ad814535983bdd3740a7cd3d8c344b8ddf68658a3e7d51e045ec46e07d7"} Jan 30 05:27:12 crc 
kubenswrapper[4931]: I0130 05:27:12.706695 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.778091 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff3c7ac-e403-4826-bf45-a6bed05570b7-operator-scripts\") pod \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.778199 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz255\" (UniqueName: \"kubernetes.io/projected/1ff3c7ac-e403-4826-bf45-a6bed05570b7-kube-api-access-xz255\") pod \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\" (UID: \"1ff3c7ac-e403-4826-bf45-a6bed05570b7\") " Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.778629 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff3c7ac-e403-4826-bf45-a6bed05570b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ff3c7ac-e403-4826-bf45-a6bed05570b7" (UID: "1ff3c7ac-e403-4826-bf45-a6bed05570b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.792064 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff3c7ac-e403-4826-bf45-a6bed05570b7-kube-api-access-xz255" (OuterVolumeSpecName: "kube-api-access-xz255") pod "1ff3c7ac-e403-4826-bf45-a6bed05570b7" (UID: "1ff3c7ac-e403-4826-bf45-a6bed05570b7"). InnerVolumeSpecName "kube-api-access-xz255". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.879752 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz255\" (UniqueName: \"kubernetes.io/projected/1ff3c7ac-e403-4826-bf45-a6bed05570b7-kube-api-access-xz255\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.879780 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ff3c7ac-e403-4826-bf45-a6bed05570b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.911373 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.916459 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.925136 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.942386 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.957222 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.980356 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-operator-scripts\") pod \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.980841 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlfnc\" (UniqueName: \"kubernetes.io/projected/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-kube-api-access-jlfnc\") pod \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\" (UID: \"6bdb7d70-31a9-4d52-aae0-072e8c62a23f\") " Jan 30 05:27:12 crc kubenswrapper[4931]: I0130 05:27:12.983892 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bdb7d70-31a9-4d52-aae0-072e8c62a23f" (UID: "6bdb7d70-31a9-4d52-aae0-072e8c62a23f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.005533 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-kube-api-access-jlfnc" (OuterVolumeSpecName: "kube-api-access-jlfnc") pod "6bdb7d70-31a9-4d52-aae0-072e8c62a23f" (UID: "6bdb7d70-31a9-4d52-aae0-072e8c62a23f"). InnerVolumeSpecName "kube-api-access-jlfnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.084990 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/053ccacf-d473-49f5-89e5-545a753e5e03-operator-scripts\") pod \"053ccacf-d473-49f5-89e5-545a753e5e03\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085058 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5cbb37a-882a-46cf-9cee-0543ac708004-operator-scripts\") pod \"d5cbb37a-882a-46cf-9cee-0543ac708004\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085114 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2bbp\" (UniqueName: \"kubernetes.io/projected/e8624816-8c2c-4d9c-b3a5-426253850926-kube-api-access-b2bbp\") pod \"e8624816-8c2c-4d9c-b3a5-426253850926\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085134 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt5hz\" (UniqueName: \"kubernetes.io/projected/d5cbb37a-882a-46cf-9cee-0543ac708004-kube-api-access-kt5hz\") pod \"d5cbb37a-882a-46cf-9cee-0543ac708004\" (UID: \"d5cbb37a-882a-46cf-9cee-0543ac708004\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085216 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkdss\" (UniqueName: \"kubernetes.io/projected/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-kube-api-access-mkdss\") pod \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085255 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8624816-8c2c-4d9c-b3a5-426253850926-operator-scripts\") pod \"e8624816-8c2c-4d9c-b3a5-426253850926\" (UID: \"e8624816-8c2c-4d9c-b3a5-426253850926\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085272 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89msj\" (UniqueName: \"kubernetes.io/projected/053ccacf-d473-49f5-89e5-545a753e5e03-kube-api-access-89msj\") pod \"053ccacf-d473-49f5-89e5-545a753e5e03\" (UID: \"053ccacf-d473-49f5-89e5-545a753e5e03\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085317 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-operator-scripts\") pod \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\" (UID: \"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1\") " Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085674 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlfnc\" (UniqueName: \"kubernetes.io/projected/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-kube-api-access-jlfnc\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.085692 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb7d70-31a9-4d52-aae0-072e8c62a23f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.087782 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/053ccacf-d473-49f5-89e5-545a753e5e03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "053ccacf-d473-49f5-89e5-545a753e5e03" (UID: "053ccacf-d473-49f5-89e5-545a753e5e03"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.088036 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5cbb37a-882a-46cf-9cee-0543ac708004-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5cbb37a-882a-46cf-9cee-0543ac708004" (UID: "d5cbb37a-882a-46cf-9cee-0543ac708004"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.088645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1" (UID: "7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.088959 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8624816-8c2c-4d9c-b3a5-426253850926-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8624816-8c2c-4d9c-b3a5-426253850926" (UID: "e8624816-8c2c-4d9c-b3a5-426253850926"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.093621 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-kube-api-access-mkdss" (OuterVolumeSpecName: "kube-api-access-mkdss") pod "7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1" (UID: "7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1"). InnerVolumeSpecName "kube-api-access-mkdss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.094784 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5cbb37a-882a-46cf-9cee-0543ac708004-kube-api-access-kt5hz" (OuterVolumeSpecName: "kube-api-access-kt5hz") pod "d5cbb37a-882a-46cf-9cee-0543ac708004" (UID: "d5cbb37a-882a-46cf-9cee-0543ac708004"). InnerVolumeSpecName "kube-api-access-kt5hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.096360 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/053ccacf-d473-49f5-89e5-545a753e5e03-kube-api-access-89msj" (OuterVolumeSpecName: "kube-api-access-89msj") pod "053ccacf-d473-49f5-89e5-545a753e5e03" (UID: "053ccacf-d473-49f5-89e5-545a753e5e03"). InnerVolumeSpecName "kube-api-access-89msj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.096511 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8624816-8c2c-4d9c-b3a5-426253850926-kube-api-access-b2bbp" (OuterVolumeSpecName: "kube-api-access-b2bbp") pod "e8624816-8c2c-4d9c-b3a5-426253850926" (UID: "e8624816-8c2c-4d9c-b3a5-426253850926"). InnerVolumeSpecName "kube-api-access-b2bbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187220 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89msj\" (UniqueName: \"kubernetes.io/projected/053ccacf-d473-49f5-89e5-545a753e5e03-kube-api-access-89msj\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187254 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187263 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/053ccacf-d473-49f5-89e5-545a753e5e03-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187272 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5cbb37a-882a-46cf-9cee-0543ac708004-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187281 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2bbp\" (UniqueName: \"kubernetes.io/projected/e8624816-8c2c-4d9c-b3a5-426253850926-kube-api-access-b2bbp\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187290 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt5hz\" (UniqueName: \"kubernetes.io/projected/d5cbb37a-882a-46cf-9cee-0543ac708004-kube-api-access-kt5hz\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187298 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkdss\" (UniqueName: \"kubernetes.io/projected/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1-kube-api-access-mkdss\") on node \"crc\" DevicePath \"\"" 
Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.187306 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8624816-8c2c-4d9c-b3a5-426253850926-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.249842 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerStarted","Data":"db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.251599 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e","Type":"ContainerStarted","Data":"2823dcc09d156bc746ffbc3ab196c3d6e136f453bf377837d7cce10861a168f4"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.252676 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.253931 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0120-account-create-update-dptmf" event={"ID":"7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1","Type":"ContainerDied","Data":"2faa6341fc48e403680c1a43938fac76f4ec0ce1bf1abf1d909499ba638bb1a3"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.253953 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2faa6341fc48e403680c1a43938fac76f4ec0ce1bf1abf1d909499ba638bb1a3" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.254003 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0120-account-create-update-dptmf" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.258578 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.258571 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-10f6-account-create-update-vfdzl" event={"ID":"053ccacf-d473-49f5-89e5-545a753e5e03","Type":"ContainerDied","Data":"b9fcfefa3cc95bbe91be11ad77ac98a7d6d76da884fe714257e2e61bdeb26830"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.258705 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9fcfefa3cc95bbe91be11ad77ac98a7d6d76da884fe714257e2e61bdeb26830" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.260097 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vbzqc" event={"ID":"6bdb7d70-31a9-4d52-aae0-072e8c62a23f","Type":"ContainerDied","Data":"39e1e0fee7385124dd916707bda2070cfe0dfef223110231a2b4c91e3fb436e2"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.260132 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e1e0fee7385124dd916707bda2070cfe0dfef223110231a2b4c91e3fb436e2" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.260143 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vbzqc" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.269789 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xvdtt" event={"ID":"d5cbb37a-882a-46cf-9cee-0543ac708004","Type":"ContainerDied","Data":"a99db27aa7441abc06efd7c51c328c92f8d197643334e3cadd83bbe4996d30bd"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.269837 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99db27aa7441abc06efd7c51c328c92f8d197643334e3cadd83bbe4996d30bd" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.269945 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xvdtt" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.273707 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3415cfc4-a71a-4110-bf82-295181bb386f","Type":"ContainerStarted","Data":"cc448c5e4a9d4def969b75156b3cc39bbccbe47f49a05ef9d15592b4643a809f"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.276829 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.276812127 podStartE2EDuration="3.276812127s" podCreationTimestamp="2026-01-30 05:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:13.272878115 +0000 UTC m=+1168.642788372" watchObservedRunningTime="2026-01-30 05:27:13.276812127 +0000 UTC m=+1168.646722384" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.279241 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-326d-account-create-update-rvcsw" event={"ID":"1ff3c7ac-e403-4826-bf45-a6bed05570b7","Type":"ContainerDied","Data":"4719e62df3f7e4e05282f94f65b6783badfcadb12c381754281d462174b59154"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.279284 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4719e62df3f7e4e05282f94f65b6783badfcadb12c381754281d462174b59154" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.279361 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-rvcsw" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.286025 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x4mqp" event={"ID":"e8624816-8c2c-4d9c-b3a5-426253850926","Type":"ContainerDied","Data":"6bf02e4c1dbd5fde8736e51c53a7aee2fa38184d7d077a5959c84e1d4d9b84d2"} Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.286063 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bf02e4c1dbd5fde8736e51c53a7aee2fa38184d7d077a5959c84e1d4d9b84d2" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.286127 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x4mqp" Jan 30 05:27:13 crc kubenswrapper[4931]: I0130 05:27:13.295195 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.295181114 podStartE2EDuration="3.295181114s" podCreationTimestamp="2026-01-30 05:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:13.293058289 +0000 UTC m=+1168.662968546" watchObservedRunningTime="2026-01-30 05:27:13.295181114 +0000 UTC m=+1168.665091371" Jan 30 05:27:14 crc kubenswrapper[4931]: I0130 05:27:14.887091 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:14 crc kubenswrapper[4931]: I0130 05:27:14.887862 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-log" containerID="cri-o://d2ded790b556dc13af017d23c970f8fe6d49472a2741355949522d19b2e1e389" gracePeriod=30 Jan 30 05:27:14 crc kubenswrapper[4931]: I0130 05:27:14.887945 4931 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-httpd" containerID="cri-o://6f20ab78e04ca2466a780c5cc51a4b37e0f487abee57f4e067c29bab7787be5d" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.303306 4931 generic.go:334] "Generic (PLEG): container finished" podID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerID="d2ded790b556dc13af017d23c970f8fe6d49472a2741355949522d19b2e1e389" exitCode=143 Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.303382 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18f01f64-f6e4-42f3-80f8-27c86f82eeef","Type":"ContainerDied","Data":"d2ded790b556dc13af017d23c970f8fe6d49472a2741355949522d19b2e1e389"} Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.306010 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerStarted","Data":"f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658"} Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.306204 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-central-agent" containerID="cri-o://859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.306889 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="proxy-httpd" containerID="cri-o://f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.306967 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="sg-core" containerID="cri-o://db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.307028 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-notification-agent" containerID="cri-o://a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4931]: I0130 05:27:15.336984 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.407954608 podStartE2EDuration="6.336958589s" podCreationTimestamp="2026-01-30 05:27:09 +0000 UTC" firstStartedPulling="2026-01-30 05:27:10.456035924 +0000 UTC m=+1165.825946191" lastFinishedPulling="2026-01-30 05:27:14.385039905 +0000 UTC m=+1169.754950172" observedRunningTime="2026-01-30 05:27:15.330127891 +0000 UTC m=+1170.700038148" watchObservedRunningTime="2026-01-30 05:27:15.336958589 +0000 UTC m=+1170.706868846" Jan 30 05:27:16 crc kubenswrapper[4931]: I0130 05:27:16.320681 4931 generic.go:334] "Generic (PLEG): container finished" podID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerID="f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658" exitCode=0 Jan 30 05:27:16 crc kubenswrapper[4931]: I0130 05:27:16.321066 4931 generic.go:334] "Generic (PLEG): container finished" podID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerID="db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124" exitCode=2 Jan 30 05:27:16 crc kubenswrapper[4931]: I0130 05:27:16.321086 4931 generic.go:334] "Generic (PLEG): container finished" podID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerID="a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf" exitCode=0 Jan 30 05:27:16 crc kubenswrapper[4931]: I0130 05:27:16.320751 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerDied","Data":"f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658"} Jan 30 05:27:16 crc kubenswrapper[4931]: I0130 05:27:16.321144 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerDied","Data":"db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124"} Jan 30 05:27:16 crc kubenswrapper[4931]: I0130 05:27:16.321173 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerDied","Data":"a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf"} Jan 30 05:27:17 crc kubenswrapper[4931]: I0130 05:27:17.766381 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.311022 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.345686 4931 generic.go:334] "Generic (PLEG): container finished" podID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerID="859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb" exitCode=0 Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.345765 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerDied","Data":"859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb"} Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.345836 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"177d0201-cde1-4aa2-8bcd-63ebade72464","Type":"ContainerDied","Data":"dff92e478717103c570a46fcba5be2f0a5365832852a9809f2694fb9464d3ab5"} Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.345862 4931 scope.go:117] "RemoveContainer" containerID="f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.346070 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.355033 4931 generic.go:334] "Generic (PLEG): container finished" podID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerID="6f20ab78e04ca2466a780c5cc51a4b37e0f487abee57f4e067c29bab7787be5d" exitCode=0 Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.355087 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18f01f64-f6e4-42f3-80f8-27c86f82eeef","Type":"ContainerDied","Data":"6f20ab78e04ca2466a780c5cc51a4b37e0f487abee57f4e067c29bab7787be5d"} Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.370984 4931 scope.go:117] "RemoveContainer" containerID="db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390230 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-scripts\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390279 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-run-httpd\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390302 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8ww7\" (UniqueName: \"kubernetes.io/projected/177d0201-cde1-4aa2-8bcd-63ebade72464-kube-api-access-q8ww7\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390321 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-config-data\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390389 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-combined-ca-bundle\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390542 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-ceilometer-tls-certs\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390580 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-sg-core-conf-yaml\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.390651 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-log-httpd\") pod \"177d0201-cde1-4aa2-8bcd-63ebade72464\" (UID: \"177d0201-cde1-4aa2-8bcd-63ebade72464\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.391503 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.392267 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.398407 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177d0201-cde1-4aa2-8bcd-63ebade72464-kube-api-access-q8ww7" (OuterVolumeSpecName: "kube-api-access-q8ww7") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "kube-api-access-q8ww7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.398966 4931 scope.go:117] "RemoveContainer" containerID="a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.399670 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-scripts" (OuterVolumeSpecName: "scripts") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.455213 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.469919 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.471463 4931 scope.go:117] "RemoveContainer" containerID="859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.488630 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.492859 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.492886 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.492895 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177d0201-cde1-4aa2-8bcd-63ebade72464-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.492904 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8ww7\" (UniqueName: \"kubernetes.io/projected/177d0201-cde1-4aa2-8bcd-63ebade72464-kube-api-access-q8ww7\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.492913 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.492921 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.492929 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.505617 4931 
scope.go:117] "RemoveContainer" containerID="f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.506378 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658\": container with ID starting with f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658 not found: ID does not exist" containerID="f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.506442 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658"} err="failed to get container status \"f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658\": rpc error: code = NotFound desc = could not find container \"f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658\": container with ID starting with f235608970e9f08a99e6669a08f94081a8924b432253a91470fc894cf56a6658 not found: ID does not exist" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.506472 4931 scope.go:117] "RemoveContainer" containerID="db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.507312 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124\": container with ID starting with db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124 not found: ID does not exist" containerID="db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.507383 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124"} err="failed to get container status \"db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124\": rpc error: code = NotFound desc = could not find container \"db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124\": container with ID starting with db0762eac4cfade11ae03279b73a2bf8be966b75fd0991edf7a0fdb9c62c8124 not found: ID does not exist" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.507444 4931 scope.go:117] "RemoveContainer" containerID="a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.507759 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf\": container with ID starting with a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf not found: ID does not exist" containerID="a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.507790 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf"} err="failed to get container status \"a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf\": rpc error: code = NotFound desc = could not find container \"a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf\": container with ID starting with a3df13b4a30c3cc8048c90cfd2f794f41bd5d1b0a6a65d81540d524792930eaf not found: ID does not exist" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.507807 4931 scope.go:117] "RemoveContainer" containerID="859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.508075 4931 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb\": container with ID starting with 859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb not found: ID does not exist" containerID="859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.508120 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb"} err="failed to get container status \"859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb\": rpc error: code = NotFound desc = could not find container \"859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb\": container with ID starting with 859b0155d8a411e508b48bc6436155827f71dbfc993b9a213b9b2f60df1f9abb not found: ID does not exist" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.510733 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-config-data" (OuterVolumeSpecName: "config-data") pod "177d0201-cde1-4aa2-8bcd-63ebade72464" (UID: "177d0201-cde1-4aa2-8bcd-63ebade72464"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.532771 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598201 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-scripts\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598272 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598314 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-public-tls-certs\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598385 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-logs\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598403 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-combined-ca-bundle\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598474 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-config-data\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598506 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czq79\" (UniqueName: \"kubernetes.io/projected/18f01f64-f6e4-42f3-80f8-27c86f82eeef-kube-api-access-czq79\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598589 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-httpd-run\") pod \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\" (UID: \"18f01f64-f6e4-42f3-80f8-27c86f82eeef\") " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.598944 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177d0201-cde1-4aa2-8bcd-63ebade72464-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.599538 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-logs" (OuterVolumeSpecName: "logs") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.603559 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.604500 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-scripts" (OuterVolumeSpecName: "scripts") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.607598 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f01f64-f6e4-42f3-80f8-27c86f82eeef-kube-api-access-czq79" (OuterVolumeSpecName: "kube-api-access-czq79") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "kube-api-access-czq79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.633748 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.636602 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.669562 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-config-data" (OuterVolumeSpecName: "config-data") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.700588 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "18f01f64-f6e4-42f3-80f8-27c86f82eeef" (UID: "18f01f64-f6e4-42f3-80f8-27c86f82eeef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.709183 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.709227 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.709269 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.709280 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc 
kubenswrapper[4931]: I0130 05:27:18.709289 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czq79\" (UniqueName: \"kubernetes.io/projected/18f01f64-f6e4-42f3-80f8-27c86f82eeef-kube-api-access-czq79\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.709302 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18f01f64-f6e4-42f3-80f8-27c86f82eeef-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.709310 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f01f64-f6e4-42f3-80f8-27c86f82eeef-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.709369 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.759239 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.793933 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.811283 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.811318 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.817662 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 
05:27:18.817988 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdb7d70-31a9-4d52-aae0-072e8c62a23f" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818004 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdb7d70-31a9-4d52-aae0-072e8c62a23f" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818015 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff3c7ac-e403-4826-bf45-a6bed05570b7" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818021 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff3c7ac-e403-4826-bf45-a6bed05570b7" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818031 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-central-agent" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818037 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-central-agent" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818052 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-httpd" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818058 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-httpd" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818069 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-log" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818075 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-log" Jan 30 05:27:18 
crc kubenswrapper[4931]: E0130 05:27:18.818083 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-notification-agent" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818089 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-notification-agent" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818099 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8624816-8c2c-4d9c-b3a5-426253850926" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818104 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8624816-8c2c-4d9c-b3a5-426253850926" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818208 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="053ccacf-d473-49f5-89e5-545a753e5e03" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818214 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="053ccacf-d473-49f5-89e5-545a753e5e03" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818225 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="sg-core" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818230 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="sg-core" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818245 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5cbb37a-882a-46cf-9cee-0543ac708004" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818253 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d5cbb37a-882a-46cf-9cee-0543ac708004" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818268 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="proxy-httpd" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818275 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="proxy-httpd" Jan 30 05:27:18 crc kubenswrapper[4931]: E0130 05:27:18.818290 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818296 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818513 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-central-agent" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818526 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="sg-core" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818537 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="proxy-httpd" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818550 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="053ccacf-d473-49f5-89e5-545a753e5e03" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818562 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff3c7ac-e403-4826-bf45-a6bed05570b7" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 
05:27:18.818569 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1" containerName="mariadb-account-create-update" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818578 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8624816-8c2c-4d9c-b3a5-426253850926" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818588 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5cbb37a-882a-46cf-9cee-0543ac708004" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818598 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" containerName="ceilometer-notification-agent" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818609 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-httpd" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818617 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" containerName="glance-log" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.818623 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdb7d70-31a9-4d52-aae0-072e8c62a23f" containerName="mariadb-database-create" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.820380 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.822043 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.822573 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.826089 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.827634 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912524 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-config-data\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912597 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9n8v\" (UniqueName: \"kubernetes.io/projected/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-kube-api-access-f9n8v\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912735 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-run-httpd\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912751 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-scripts\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912821 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912877 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-log-httpd\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4931]: I0130 05:27:18.912892 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015129 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015253 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9n8v\" (UniqueName: \"kubernetes.io/projected/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-kube-api-access-f9n8v\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015305 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-run-httpd\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015343 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-scripts\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015460 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015520 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-log-httpd\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015549 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.015615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-config-data\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.016023 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-run-httpd\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.016129 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-log-httpd\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.020558 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-config-data\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.020737 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 
05:27:19.022409 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.025719 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-scripts\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.033232 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.038284 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9n8v\" (UniqueName: \"kubernetes.io/projected/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-kube-api-access-f9n8v\") pod \"ceilometer-0\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.063165 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s287f"] Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.065389 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.067878 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5k4cm" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.068393 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.068633 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.076212 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s287f"] Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.134713 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.218830 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.219076 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-scripts\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.219151 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzrx\" (UniqueName: 
\"kubernetes.io/projected/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-kube-api-access-trzrx\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.219257 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-config-data\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.321005 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-scripts\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.321246 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trzrx\" (UniqueName: \"kubernetes.io/projected/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-kube-api-access-trzrx\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.321278 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-config-data\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.321327 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.327266 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-scripts\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.327498 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-config-data\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.336416 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzrx\" (UniqueName: \"kubernetes.io/projected/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-kube-api-access-trzrx\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.371406 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s287f\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.385795 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"18f01f64-f6e4-42f3-80f8-27c86f82eeef","Type":"ContainerDied","Data":"f169988e956408b39f47bea60212630dcedf5b4c3315a89463a6589988357590"} Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.385839 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.385855 4931 scope.go:117] "RemoveContainer" containerID="6f20ab78e04ca2466a780c5cc51a4b37e0f487abee57f4e067c29bab7787be5d" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.412537 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.420710 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.431396 4931 scope.go:117] "RemoveContainer" containerID="d2ded790b556dc13af017d23c970f8fe6d49472a2741355949522d19b2e1e389" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.437459 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177d0201-cde1-4aa2-8bcd-63ebade72464" path="/var/lib/kubelet/pods/177d0201-cde1-4aa2-8bcd-63ebade72464/volumes" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.438117 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.451509 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.453062 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.455560 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.456071 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.465329 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524311 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524412 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524507 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-logs\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524626 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.524657 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmpkl\" (UniqueName: \"kubernetes.io/projected/7c0ddaec-4521-4898-8649-262b52f24acb-kube-api-access-pmpkl\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.626521 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.626814 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.626838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-logs\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.626857 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.626878 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpkl\" (UniqueName: \"kubernetes.io/projected/7c0ddaec-4521-4898-8649-262b52f24acb-kube-api-access-pmpkl\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.626921 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.626990 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.627014 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.627380 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.634490 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.635117 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-logs\") pod 
\"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.640529 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.640836 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.643162 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.643622 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.652790 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmpkl\" (UniqueName: \"kubernetes.io/projected/7c0ddaec-4521-4898-8649-262b52f24acb-kube-api-access-pmpkl\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " 
pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.673905 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:19 crc kubenswrapper[4931]: W0130 05:27:19.675009 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8c888ca_c1fd_452f_9fd2_ff821f4b18e5.slice/crio-5e4a201ab16bebe12731751910597ba0cdd866af8fc6c845f3bcb4b6eb6b4330 WatchSource:0}: Error finding container 5e4a201ab16bebe12731751910597ba0cdd866af8fc6c845f3bcb4b6eb6b4330: Status 404 returned error can't find the container with id 5e4a201ab16bebe12731751910597ba0cdd866af8fc6c845f3bcb4b6eb6b4330 Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.677287 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.783737 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:19 crc kubenswrapper[4931]: I0130 05:27:19.882588 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s287f"] Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.333357 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:20 crc kubenswrapper[4931]: W0130 05:27:20.334217 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c0ddaec_4521_4898_8649_262b52f24acb.slice/crio-f1e40e63465f32ce48c188f63cada07df803d2d0b29cf0b23188f72f3a13a25b WatchSource:0}: Error finding container f1e40e63465f32ce48c188f63cada07df803d2d0b29cf0b23188f72f3a13a25b: Status 404 returned error can't find the container with id f1e40e63465f32ce48c188f63cada07df803d2d0b29cf0b23188f72f3a13a25b Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.411256 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s287f" event={"ID":"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b","Type":"ContainerStarted","Data":"33ba5db3481aa96c9b6d1d5ea1daf2013941f187605ad329f6f6d5a0b2ba2f94"} Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.414386 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c0ddaec-4521-4898-8649-262b52f24acb","Type":"ContainerStarted","Data":"f1e40e63465f32ce48c188f63cada07df803d2d0b29cf0b23188f72f3a13a25b"} Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.418406 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerStarted","Data":"c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6"} Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.418506 4931 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerStarted","Data":"5e4a201ab16bebe12731751910597ba0cdd866af8fc6c845f3bcb4b6eb6b4330"} Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.614254 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.614300 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.667295 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:20 crc kubenswrapper[4931]: I0130 05:27:20.677056 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:21 crc kubenswrapper[4931]: I0130 05:27:21.435136 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f01f64-f6e4-42f3-80f8-27c86f82eeef" path="/var/lib/kubelet/pods/18f01f64-f6e4-42f3-80f8-27c86f82eeef/volumes" Jan 30 05:27:21 crc kubenswrapper[4931]: I0130 05:27:21.457799 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerStarted","Data":"fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb"} Jan 30 05:27:21 crc kubenswrapper[4931]: I0130 05:27:21.468121 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c0ddaec-4521-4898-8649-262b52f24acb","Type":"ContainerStarted","Data":"754804bc268dc311547eddc996a035b132392b26798898d2ba034bc32dc1ee16"} Jan 30 05:27:21 crc kubenswrapper[4931]: I0130 05:27:21.468546 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:21 crc 
kubenswrapper[4931]: I0130 05:27:21.468566 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:22 crc kubenswrapper[4931]: I0130 05:27:22.481814 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c0ddaec-4521-4898-8649-262b52f24acb","Type":"ContainerStarted","Data":"3509d69982e816f9732671b256bbf363b32c9a199362011499a8607bf3a6e808"} Jan 30 05:27:22 crc kubenswrapper[4931]: I0130 05:27:22.485632 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerStarted","Data":"1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834"} Jan 30 05:27:22 crc kubenswrapper[4931]: I0130 05:27:22.503407 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.503386991 podStartE2EDuration="3.503386991s" podCreationTimestamp="2026-01-30 05:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:22.497472407 +0000 UTC m=+1177.867382664" watchObservedRunningTime="2026-01-30 05:27:22.503386991 +0000 UTC m=+1177.873297258" Jan 30 05:27:23 crc kubenswrapper[4931]: I0130 05:27:23.016623 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 05:27:23 crc kubenswrapper[4931]: I0130 05:27:23.241124 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:23 crc kubenswrapper[4931]: I0130 05:27:23.568193 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:23 crc kubenswrapper[4931]: I0130 05:27:23.568560 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:27:24 
crc kubenswrapper[4931]: I0130 05:27:24.108217 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:24 crc kubenswrapper[4931]: I0130 05:27:24.508495 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-central-agent" containerID="cri-o://c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6" gracePeriod=30 Jan 30 05:27:24 crc kubenswrapper[4931]: I0130 05:27:24.508852 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerStarted","Data":"3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133"} Jan 30 05:27:24 crc kubenswrapper[4931]: I0130 05:27:24.508932 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:27:24 crc kubenswrapper[4931]: I0130 05:27:24.509198 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="proxy-httpd" containerID="cri-o://3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133" gracePeriod=30 Jan 30 05:27:24 crc kubenswrapper[4931]: I0130 05:27:24.509283 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="sg-core" containerID="cri-o://1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834" gracePeriod=30 Jan 30 05:27:24 crc kubenswrapper[4931]: I0130 05:27:24.509320 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-notification-agent" 
containerID="cri-o://fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb" gracePeriod=30 Jan 30 05:27:25 crc kubenswrapper[4931]: I0130 05:27:25.448714 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.830387463 podStartE2EDuration="7.448699274s" podCreationTimestamp="2026-01-30 05:27:18 +0000 UTC" firstStartedPulling="2026-01-30 05:27:19.678846081 +0000 UTC m=+1175.048756338" lastFinishedPulling="2026-01-30 05:27:23.297157892 +0000 UTC m=+1178.667068149" observedRunningTime="2026-01-30 05:27:24.538039871 +0000 UTC m=+1179.907950128" watchObservedRunningTime="2026-01-30 05:27:25.448699274 +0000 UTC m=+1180.818609531" Jan 30 05:27:25 crc kubenswrapper[4931]: I0130 05:27:25.519559 4931 generic.go:334] "Generic (PLEG): container finished" podID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerID="3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133" exitCode=0 Jan 30 05:27:25 crc kubenswrapper[4931]: I0130 05:27:25.519590 4931 generic.go:334] "Generic (PLEG): container finished" podID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerID="1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834" exitCode=2 Jan 30 05:27:25 crc kubenswrapper[4931]: I0130 05:27:25.519597 4931 generic.go:334] "Generic (PLEG): container finished" podID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerID="fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb" exitCode=0 Jan 30 05:27:25 crc kubenswrapper[4931]: I0130 05:27:25.519618 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerDied","Data":"3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133"} Jan 30 05:27:25 crc kubenswrapper[4931]: I0130 05:27:25.519650 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerDied","Data":"1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834"} Jan 30 05:27:25 crc kubenswrapper[4931]: I0130 05:27:25.519660 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerDied","Data":"fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb"} Jan 30 05:27:29 crc kubenswrapper[4931]: I0130 05:27:29.784965 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 05:27:29 crc kubenswrapper[4931]: I0130 05:27:29.785391 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 05:27:29 crc kubenswrapper[4931]: I0130 05:27:29.820686 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 05:27:29 crc kubenswrapper[4931]: I0130 05:27:29.876159 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.518968 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.562546 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s287f" event={"ID":"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b","Type":"ContainerStarted","Data":"c2771265ae8a990e0e69c0f116c64cc25eecd94f6e185173fb3394e2e6fbe468"} Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.566921 4931 generic.go:334] "Generic (PLEG): container finished" podID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerID="c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6" exitCode=0 Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.567713 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.567735 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerDied","Data":"c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6"} Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.567795 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5","Type":"ContainerDied","Data":"5e4a201ab16bebe12731751910597ba0cdd866af8fc6c845f3bcb4b6eb6b4330"} Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.567814 4931 scope.go:117] "RemoveContainer" containerID="3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.568096 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.570507 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.592948 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-s287f" podStartSLOduration=1.618121457 podStartE2EDuration="11.59292611s" podCreationTimestamp="2026-01-30 05:27:19 +0000 UTC" firstStartedPulling="2026-01-30 05:27:19.899593417 +0000 UTC m=+1175.269503674" lastFinishedPulling="2026-01-30 05:27:29.87439807 +0000 UTC m=+1185.244308327" observedRunningTime="2026-01-30 05:27:30.584699156 +0000 UTC m=+1185.954609413" watchObservedRunningTime="2026-01-30 05:27:30.59292611 +0000 UTC m=+1185.962836367" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.595618 4931 scope.go:117] "RemoveContainer" containerID="1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.619725 4931 scope.go:117] "RemoveContainer" containerID="fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.632734 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-run-httpd\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.632809 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-log-httpd\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.632871 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-config-data\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 
05:27:30.632907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-scripts\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.632937 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-sg-core-conf-yaml\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.632995 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-ceilometer-tls-certs\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.633070 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-combined-ca-bundle\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.633108 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9n8v\" (UniqueName: \"kubernetes.io/projected/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-kube-api-access-f9n8v\") pod \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\" (UID: \"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5\") " Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.637689 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.638174 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.644512 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-kube-api-access-f9n8v" (OuterVolumeSpecName: "kube-api-access-f9n8v") pod "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "kube-api-access-f9n8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.647812 4931 scope.go:117] "RemoveContainer" containerID="c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.650504 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-scripts" (OuterVolumeSpecName: "scripts") pod "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.677658 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.683033 4931 scope.go:117] "RemoveContainer" containerID="3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133" Jan 30 05:27:30 crc kubenswrapper[4931]: E0130 05:27:30.683494 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133\": container with ID starting with 3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133 not found: ID does not exist" containerID="3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.683550 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133"} err="failed to get container status \"3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133\": rpc error: code = NotFound desc = could not find container \"3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133\": container with ID starting with 3d1af17ef4276356fc20e1e86fb994d75422189cd3d057d792ef2830cd3e7133 not found: ID does not exist" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.683687 4931 scope.go:117] "RemoveContainer" containerID="1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834" Jan 30 05:27:30 crc kubenswrapper[4931]: E0130 05:27:30.684043 4931 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834\": container with ID starting with 1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834 not found: ID does not exist" containerID="1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.684095 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834"} err="failed to get container status \"1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834\": rpc error: code = NotFound desc = could not find container \"1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834\": container with ID starting with 1c290deb1ddccdee82913980b6f27c116b92dbaa10273cd5835fbf2da7ff3834 not found: ID does not exist" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.684113 4931 scope.go:117] "RemoveContainer" containerID="fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb" Jan 30 05:27:30 crc kubenswrapper[4931]: E0130 05:27:30.684335 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb\": container with ID starting with fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb not found: ID does not exist" containerID="fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.684359 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb"} err="failed to get container status \"fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb\": rpc error: code = NotFound desc = could 
not find container \"fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb\": container with ID starting with fff24e78cf96f52120648f1c7bc9b2d5e91c3ff0f2cf372e795cdf62b3b76dfb not found: ID does not exist" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.684375 4931 scope.go:117] "RemoveContainer" containerID="c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6" Jan 30 05:27:30 crc kubenswrapper[4931]: E0130 05:27:30.684651 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6\": container with ID starting with c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6 not found: ID does not exist" containerID="c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.684677 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6"} err="failed to get container status \"c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6\": rpc error: code = NotFound desc = could not find container \"c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6\": container with ID starting with c6de91be4c5b640d1c85b573a20a5ff3615c8377869fa35e5e5c44374ab7f6c6 not found: ID does not exist" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.707488 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.732039 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.736546 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.736569 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9n8v\" (UniqueName: \"kubernetes.io/projected/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-kube-api-access-f9n8v\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.736585 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.736597 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.736608 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.736618 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.736629 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.757842 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-config-data" (OuterVolumeSpecName: "config-data") pod "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" (UID: "a8c888ca-c1fd-452f-9fd2-ff821f4b18e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.838732 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.980116 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:30 crc kubenswrapper[4931]: I0130 05:27:30.997804 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.007704 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:31 crc kubenswrapper[4931]: E0130 05:27:31.008128 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="proxy-httpd" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008150 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="proxy-httpd" Jan 30 05:27:31 crc kubenswrapper[4931]: E0130 05:27:31.008180 4931 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="sg-core" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008189 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="sg-core" Jan 30 05:27:31 crc kubenswrapper[4931]: E0130 05:27:31.008206 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-notification-agent" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008214 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-notification-agent" Jan 30 05:27:31 crc kubenswrapper[4931]: E0130 05:27:31.008230 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-central-agent" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008238 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-central-agent" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008477 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="sg-core" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008508 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-notification-agent" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008519 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="ceilometer-central-agent" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.008528 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" containerName="proxy-httpd" Jan 30 05:27:31 crc kubenswrapper[4931]: 
I0130 05:27:31.021137 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.021252 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.024168 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.024572 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.025726 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.146777 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.146817 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xw72\" (UniqueName: \"kubernetes.io/projected/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-kube-api-access-9xw72\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.146835 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.146895 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-run-httpd\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.146926 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-log-httpd\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.147048 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-scripts\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.147152 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.147238 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-config-data\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.248802 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.249108 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-config-data\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.249491 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.249604 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xw72\" (UniqueName: \"kubernetes.io/projected/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-kube-api-access-9xw72\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.249701 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.249867 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-run-httpd\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc 
kubenswrapper[4931]: I0130 05:27:31.250020 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-log-httpd\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.250219 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-run-httpd\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.250393 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-log-httpd\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.251669 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-scripts\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.254052 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.254881 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.255784 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-config-data\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.264452 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.269358 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-scripts\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.272858 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xw72\" (UniqueName: \"kubernetes.io/projected/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-kube-api-access-9xw72\") pod \"ceilometer-0\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.363542 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.446879 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c888ca-c1fd-452f-9fd2-ff821f4b18e5" path="/var/lib/kubelet/pods/a8c888ca-c1fd-452f-9fd2-ff821f4b18e5/volumes" Jan 30 05:27:31 crc kubenswrapper[4931]: W0130 05:27:31.846492 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bf8c845_e69c_41e6_9e59_4b9b9fceaaf7.slice/crio-011e359b19d615fee43023e3c1e45b97d099d18b94cb63d33b5905aec3a68e27 WatchSource:0}: Error finding container 011e359b19d615fee43023e3c1e45b97d099d18b94cb63d33b5905aec3a68e27: Status 404 returned error can't find the container with id 011e359b19d615fee43023e3c1e45b97d099d18b94cb63d33b5905aec3a68e27 Jan 30 05:27:31 crc kubenswrapper[4931]: I0130 05:27:31.847605 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:32 crc kubenswrapper[4931]: I0130 05:27:32.408134 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 05:27:32 crc kubenswrapper[4931]: I0130 05:27:32.586018 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerStarted","Data":"011e359b19d615fee43023e3c1e45b97d099d18b94cb63d33b5905aec3a68e27"} Jan 30 05:27:32 crc kubenswrapper[4931]: I0130 05:27:32.586054 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 05:27:32 crc kubenswrapper[4931]: I0130 05:27:32.651664 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 05:27:33 crc kubenswrapper[4931]: I0130 05:27:33.596408 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerStarted","Data":"782c5f8dbe0e7576669ed328ec36323e5d152ca2c37c77db802604122975e914"} Jan 30 05:27:34 crc kubenswrapper[4931]: I0130 05:27:34.641105 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerStarted","Data":"89d3e2e5267fbad75d566e6ba9ac104cbc8326782fb62f0bd4ed8c4f9b169c76"} Jan 30 05:27:35 crc kubenswrapper[4931]: I0130 05:27:35.653262 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerStarted","Data":"37cec202033bdc3e70a415b285410f6ce8158b9541b9d20f36bd938b5978559c"} Jan 30 05:27:36 crc kubenswrapper[4931]: I0130 05:27:36.666902 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerStarted","Data":"149c802638d11431261e2009e655ac397a1354f084b06d7f2da2c77118f48d07"} Jan 30 05:27:36 crc kubenswrapper[4931]: I0130 05:27:36.668725 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:27:36 crc kubenswrapper[4931]: I0130 05:27:36.707547 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.268554896 podStartE2EDuration="6.707522656s" podCreationTimestamp="2026-01-30 05:27:30 +0000 UTC" firstStartedPulling="2026-01-30 05:27:31.850128902 +0000 UTC m=+1187.220039179" lastFinishedPulling="2026-01-30 05:27:36.289096672 +0000 UTC m=+1191.659006939" observedRunningTime="2026-01-30 05:27:36.689101828 +0000 UTC m=+1192.059012125" watchObservedRunningTime="2026-01-30 05:27:36.707522656 +0000 UTC m=+1192.077432953" Jan 30 05:27:43 crc kubenswrapper[4931]: I0130 05:27:43.772164 4931 generic.go:334] "Generic (PLEG): container finished" podID="ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" 
containerID="c2771265ae8a990e0e69c0f116c64cc25eecd94f6e185173fb3394e2e6fbe468" exitCode=0 Jan 30 05:27:43 crc kubenswrapper[4931]: I0130 05:27:43.772232 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s287f" event={"ID":"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b","Type":"ContainerDied","Data":"c2771265ae8a990e0e69c0f116c64cc25eecd94f6e185173fb3394e2e6fbe468"} Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.194217 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.271257 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-config-data\") pod \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.271300 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trzrx\" (UniqueName: \"kubernetes.io/projected/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-kube-api-access-trzrx\") pod \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.271473 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-combined-ca-bundle\") pod \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\" (UID: \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.271545 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-scripts\") pod \"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\" (UID: 
\"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b\") " Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.279767 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-scripts" (OuterVolumeSpecName: "scripts") pod "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" (UID: "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.280022 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-kube-api-access-trzrx" (OuterVolumeSpecName: "kube-api-access-trzrx") pod "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" (UID: "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b"). InnerVolumeSpecName "kube-api-access-trzrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.306230 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" (UID: "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.308808 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-config-data" (OuterVolumeSpecName: "config-data") pod "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" (UID: "ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.374058 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.374107 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.374130 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trzrx\" (UniqueName: \"kubernetes.io/projected/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-kube-api-access-trzrx\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.374150 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.813762 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s287f" event={"ID":"ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b","Type":"ContainerDied","Data":"33ba5db3481aa96c9b6d1d5ea1daf2013941f187605ad329f6f6d5a0b2ba2f94"} Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.813821 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ba5db3481aa96c9b6d1d5ea1daf2013941f187605ad329f6f6d5a0b2ba2f94" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.813920 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s287f" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.956751 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 05:27:45 crc kubenswrapper[4931]: E0130 05:27:45.957410 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" containerName="nova-cell0-conductor-db-sync" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.957474 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" containerName="nova-cell0-conductor-db-sync" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.957889 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" containerName="nova-cell0-conductor-db-sync" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.959005 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:45 crc kubenswrapper[4931]: I0130 05:27:45.966853 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.002319 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5k4cm" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.002732 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.103938 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hln5j\" (UniqueName: \"kubernetes.io/projected/88988b92-cd64-490d-b55f-959ecf4095af-kube-api-access-hln5j\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc 
kubenswrapper[4931]: I0130 05:27:46.104274 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.104460 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.206642 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hln5j\" (UniqueName: \"kubernetes.io/projected/88988b92-cd64-490d-b55f-959ecf4095af-kube-api-access-hln5j\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.206899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.206949 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.224500 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.224537 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.229988 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hln5j\" (UniqueName: \"kubernetes.io/projected/88988b92-cd64-490d-b55f-959ecf4095af-kube-api-access-hln5j\") pod \"nova-cell0-conductor-0\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") " pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.337691 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:46 crc kubenswrapper[4931]: I0130 05:27:46.903376 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 05:27:47 crc kubenswrapper[4931]: I0130 05:27:47.841313 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"88988b92-cd64-490d-b55f-959ecf4095af","Type":"ContainerStarted","Data":"83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06"} Jan 30 05:27:47 crc kubenswrapper[4931]: I0130 05:27:47.841721 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"88988b92-cd64-490d-b55f-959ecf4095af","Type":"ContainerStarted","Data":"56a8c3403b77c67382071da65bd384ea85d43f4776ebb7971c9a14fd4e392984"} Jan 30 05:27:47 crc kubenswrapper[4931]: I0130 05:27:47.841794 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:47 crc kubenswrapper[4931]: I0130 05:27:47.871601 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.87158233 podStartE2EDuration="2.87158233s" podCreationTimestamp="2026-01-30 05:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:47.871252472 +0000 UTC m=+1203.241162759" watchObservedRunningTime="2026-01-30 05:27:47.87158233 +0000 UTC m=+1203.241492587" Jan 30 05:27:56 crc kubenswrapper[4931]: I0130 05:27:56.367616 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 05:27:56 crc kubenswrapper[4931]: I0130 05:27:56.977445 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9b8l8"] Jan 30 05:27:56 crc kubenswrapper[4931]: I0130 05:27:56.979257 4931 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:56 crc kubenswrapper[4931]: I0130 05:27:56.982020 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 30 05:27:56 crc kubenswrapper[4931]: I0130 05:27:56.982395 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 30 05:27:56 crc kubenswrapper[4931]: I0130 05:27:56.992054 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9b8l8"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.067376 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-scripts\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.067788 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf54v\" (UniqueName: \"kubernetes.io/projected/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-kube-api-access-sf54v\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.067891 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-config-data\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.067934 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.170643 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf54v\" (UniqueName: \"kubernetes.io/projected/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-kube-api-access-sf54v\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.170732 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-config-data\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.170763 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.170808 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-scripts\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.193022 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.194105 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-config-data\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.208876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-scripts\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.241306 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf54v\" (UniqueName: \"kubernetes.io/projected/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-kube-api-access-sf54v\") pod \"nova-cell0-cell-mapping-9b8l8\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.261727 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.263403 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.271769 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.310516 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.311992 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.317752 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.318136 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.325595 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.358489 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.382009 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.382141 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwsz9\" (UniqueName: \"kubernetes.io/projected/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-kube-api-access-mwsz9\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.382180 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.437773 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.446567 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.461862 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.487375 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.487485 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.490531 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f76c52b2-cfad-4017-a265-142c8e1b54f9-logs\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " 
pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.490623 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9czc\" (UniqueName: \"kubernetes.io/projected/f76c52b2-cfad-4017-a265-142c8e1b54f9-kube-api-access-z9czc\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.490678 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.490700 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-config-data\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.490862 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwsz9\" (UniqueName: \"kubernetes.io/projected/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-kube-api-access-mwsz9\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.499439 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.501042 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.531073 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.531132 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.532096 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.537827 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.544066 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwsz9\" (UniqueName: \"kubernetes.io/projected/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-kube-api-access-mwsz9\") pod \"nova-cell1-novncproxy-0\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.578488 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595046 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-config-data\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595180 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-config-data\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595288 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e37fcb81-9df1-411b-b593-8ca56c518f33-logs\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595342 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f76c52b2-cfad-4017-a265-142c8e1b54f9-logs\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595361 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595396 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9czc\" (UniqueName: \"kubernetes.io/projected/f76c52b2-cfad-4017-a265-142c8e1b54f9-kube-api-access-z9czc\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595429 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m84h6\" (UniqueName: \"kubernetes.io/projected/e37fcb81-9df1-411b-b593-8ca56c518f33-kube-api-access-m84h6\") pod \"nova-metadata-0\" (UID: 
\"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.595467 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.597276 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f76c52b2-cfad-4017-a265-142c8e1b54f9-logs\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.602074 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.617828 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-config-data\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.625678 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9czc\" (UniqueName: \"kubernetes.io/projected/f76c52b2-cfad-4017-a265-142c8e1b54f9-kube-api-access-z9czc\") pod \"nova-api-0\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") " pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.640284 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-p25hj"] Jan 30 05:27:57 crc 
kubenswrapper[4931]: I0130 05:27:57.644490 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.653461 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-p25hj"] Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.655863 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.699783 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e37fcb81-9df1-411b-b593-8ca56c518f33-logs\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.699899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.699938 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m84h6\" (UniqueName: \"kubernetes.io/projected/e37fcb81-9df1-411b-b593-8ca56c518f33-kube-api-access-m84h6\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.699985 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 
crc kubenswrapper[4931]: I0130 05:27:57.700057 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-config-data\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.700091 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-config-data\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.700942 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e37fcb81-9df1-411b-b593-8ca56c518f33-logs\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.703664 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24xlx\" (UniqueName: \"kubernetes.io/projected/018cf21e-c9c2-4ab4-8794-f6e066bafc86-kube-api-access-24xlx\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.706744 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-config-data\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.707655 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.725837 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m84h6\" (UniqueName: \"kubernetes.io/projected/e37fcb81-9df1-411b-b593-8ca56c518f33-kube-api-access-m84h6\") pod \"nova-metadata-0\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") " pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805102 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-config-data\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805191 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805239 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24xlx\" (UniqueName: \"kubernetes.io/projected/018cf21e-c9c2-4ab4-8794-f6e066bafc86-kube-api-access-24xlx\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805306 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-sb\") pod 
\"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805338 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805369 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805407 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805442 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftvmx\" (UniqueName: \"kubernetes.io/projected/4dc14ae7-f05f-4093-838b-bdd419f4302f-kube-api-access-ftvmx\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.805482 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-config\") pod 
\"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.809485 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.810768 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-config-data\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.817312 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.828330 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24xlx\" (UniqueName: \"kubernetes.io/projected/018cf21e-c9c2-4ab4-8794-f6e066bafc86-kube-api-access-24xlx\") pod \"nova-scheduler-0\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") " pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.834192 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.860735 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.913025 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-config\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.913144 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.913237 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.913271 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.913326 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 
crc kubenswrapper[4931]: I0130 05:27:57.913347 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftvmx\" (UniqueName: \"kubernetes.io/projected/4dc14ae7-f05f-4093-838b-bdd419f4302f-kube-api-access-ftvmx\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.914584 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-config\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.914584 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.914730 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.915290 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.915580 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.938414 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftvmx\" (UniqueName: \"kubernetes.io/projected/4dc14ae7-f05f-4093-838b-bdd419f4302f-kube-api-access-ftvmx\") pod \"dnsmasq-dns-557bbc7df7-p25hj\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") " pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:57 crc kubenswrapper[4931]: I0130 05:27:57.998345 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.043798 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9b8l8"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.103498 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jhn9j"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.104746 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.107099 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.107723 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.115932 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jhn9j"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.180696 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.222055 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.222095 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-scripts\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.222143 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-config-data\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 
30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.222197 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc9q6\" (UniqueName: \"kubernetes.io/projected/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-kube-api-access-dc9q6\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.304473 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.324634 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-config-data\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.324715 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc9q6\" (UniqueName: \"kubernetes.io/projected/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-kube-api-access-dc9q6\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.324795 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.324818 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-scripts\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.333164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-scripts\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.334926 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-config-data\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.335433 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.340595 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc9q6\" (UniqueName: \"kubernetes.io/projected/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-kube-api-access-dc9q6\") pod \"nova-cell1-conductor-db-sync-jhn9j\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.437324 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.479396 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.521884 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.665459 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-p25hj"] Jan 30 05:27:58 crc kubenswrapper[4931]: W0130 05:27:58.679483 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dc14ae7_f05f_4093_838b_bdd419f4302f.slice/crio-8d7831c7e52a58c64227ffa0e77bd37bd5e37f6fcdda5357867d37c004b57577 WatchSource:0}: Error finding container 8d7831c7e52a58c64227ffa0e77bd37bd5e37f6fcdda5357867d37c004b57577: Status 404 returned error can't find the container with id 8d7831c7e52a58c64227ffa0e77bd37bd5e37f6fcdda5357867d37c004b57577 Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.908653 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jhn9j"] Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.993109 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f76c52b2-cfad-4017-a265-142c8e1b54f9","Type":"ContainerStarted","Data":"ce62ba9a7d987c86fe58643c3364d299b44cd5fdb6e5e34e5b8f029af4afb80d"} Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.995045 4931 generic.go:334] "Generic (PLEG): container finished" podID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerID="056aa11a16b72fe7fde4370093154af79d24b07c3142cb8943c78be2016d3fc6" exitCode=0 Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.995269 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" 
event={"ID":"4dc14ae7-f05f-4093-838b-bdd419f4302f","Type":"ContainerDied","Data":"056aa11a16b72fe7fde4370093154af79d24b07c3142cb8943c78be2016d3fc6"} Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.995309 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" event={"ID":"4dc14ae7-f05f-4093-838b-bdd419f4302f","Type":"ContainerStarted","Data":"8d7831c7e52a58c64227ffa0e77bd37bd5e37f6fcdda5357867d37c004b57577"} Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.996259 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3","Type":"ContainerStarted","Data":"77d36bb8a11804c505d48bebee2dbafaeb19326f0f1ced3b73af355d57d86b2b"} Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.998585 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9b8l8" event={"ID":"b0a8f8fe-306a-4373-bbb0-d96f2b498d62","Type":"ContainerStarted","Data":"346e5462e41c54c8f5c2422490f080d2b64f85c405ea5cc5337aa66fee775153"} Jan 30 05:27:58 crc kubenswrapper[4931]: I0130 05:27:58.998615 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9b8l8" event={"ID":"b0a8f8fe-306a-4373-bbb0-d96f2b498d62","Type":"ContainerStarted","Data":"831b8fc29d43f639d90655aa063689063afad41ee488b8e379f4edd22fec355a"} Jan 30 05:27:59 crc kubenswrapper[4931]: I0130 05:27:59.004388 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"018cf21e-c9c2-4ab4-8794-f6e066bafc86","Type":"ContainerStarted","Data":"f850561dfba344195d6aaf76e50b967d79d23fca8dbd9d77fa357655bcad14dc"} Jan 30 05:27:59 crc kubenswrapper[4931]: I0130 05:27:59.009008 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e37fcb81-9df1-411b-b593-8ca56c518f33","Type":"ContainerStarted","Data":"78bd5fff55edf34b6d1b969823b41d22ccbf3eabadd222063f311257dc45d17d"} Jan 30 05:27:59 crc kubenswrapper[4931]: I0130 05:27:59.014519 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" event={"ID":"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f","Type":"ContainerStarted","Data":"05fd8e29ff186451c0c9db4ef3d2f2174b59837370d11120981136e0fa9ba630"} Jan 30 05:27:59 crc kubenswrapper[4931]: I0130 05:27:59.036373 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9b8l8" podStartSLOduration=3.03632974 podStartE2EDuration="3.03632974s" podCreationTimestamp="2026-01-30 05:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:59.028360343 +0000 UTC m=+1214.398270620" watchObservedRunningTime="2026-01-30 05:27:59.03632974 +0000 UTC m=+1214.406240007" Jan 30 05:28:00 crc kubenswrapper[4931]: I0130 05:28:00.024541 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" event={"ID":"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f","Type":"ContainerStarted","Data":"6fe17572613dbd341b30bd762c1b5735b49c432308fe3f9a9ede6d5185282afe"} Jan 30 05:28:00 crc kubenswrapper[4931]: I0130 05:28:00.028503 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" event={"ID":"4dc14ae7-f05f-4093-838b-bdd419f4302f","Type":"ContainerStarted","Data":"a729151ede12640ea81c41f5d7f2d36efd861e7a4d31b991fe42dd4d2139fbe2"} Jan 30 05:28:00 crc kubenswrapper[4931]: I0130 05:28:00.028855 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:28:00 crc kubenswrapper[4931]: I0130 05:28:00.044706 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-db-sync-jhn9j" podStartSLOduration=2.044685188 podStartE2EDuration="2.044685188s" podCreationTimestamp="2026-01-30 05:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:00.040378416 +0000 UTC m=+1215.410288693" watchObservedRunningTime="2026-01-30 05:28:00.044685188 +0000 UTC m=+1215.414595455" Jan 30 05:28:00 crc kubenswrapper[4931]: I0130 05:28:00.065781 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" podStartSLOduration=3.065760213 podStartE2EDuration="3.065760213s" podCreationTimestamp="2026-01-30 05:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:00.056111364 +0000 UTC m=+1215.426021661" watchObservedRunningTime="2026-01-30 05:28:00.065760213 +0000 UTC m=+1215.435670470" Jan 30 05:28:01 crc kubenswrapper[4931]: I0130 05:28:01.387123 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 05:28:01 crc kubenswrapper[4931]: I0130 05:28:01.753385 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:01 crc kubenswrapper[4931]: I0130 05:28:01.770764 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.051961 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"018cf21e-c9c2-4ab4-8794-f6e066bafc86","Type":"ContainerStarted","Data":"56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16"} Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.055139 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e37fcb81-9df1-411b-b593-8ca56c518f33","Type":"ContainerStarted","Data":"5894edfe624e331ed6304798892205879e56e8c005b21dd019bad175228231c6"} Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.055199 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e37fcb81-9df1-411b-b593-8ca56c518f33","Type":"ContainerStarted","Data":"7b8d1ff499fbb8ebcbcbf87555e175867c57e89396d957fe7b851fe96dc36245"} Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.055266 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerName="nova-metadata-log" containerID="cri-o://7b8d1ff499fbb8ebcbcbf87555e175867c57e89396d957fe7b851fe96dc36245" gracePeriod=30 Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.055296 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerName="nova-metadata-metadata" containerID="cri-o://5894edfe624e331ed6304798892205879e56e8c005b21dd019bad175228231c6" gracePeriod=30 Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.058068 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f76c52b2-cfad-4017-a265-142c8e1b54f9","Type":"ContainerStarted","Data":"0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386"} Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.058245 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f76c52b2-cfad-4017-a265-142c8e1b54f9","Type":"ContainerStarted","Data":"bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c"} Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.062492 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3","Type":"ContainerStarted","Data":"8389df76ed09df3262a31c554555764b5a96d65170cca0da5f3f94ed26654194"} Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.062835 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8389df76ed09df3262a31c554555764b5a96d65170cca0da5f3f94ed26654194" gracePeriod=30 Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.083615 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.269045106 podStartE2EDuration="5.083596868s" podCreationTimestamp="2026-01-30 05:27:57 +0000 UTC" firstStartedPulling="2026-01-30 05:27:58.53289076 +0000 UTC m=+1213.902801017" lastFinishedPulling="2026-01-30 05:28:01.347442502 +0000 UTC m=+1216.717352779" observedRunningTime="2026-01-30 05:28:02.077982172 +0000 UTC m=+1217.447892429" watchObservedRunningTime="2026-01-30 05:28:02.083596868 +0000 UTC m=+1217.453507125" Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.101810 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.245758863 podStartE2EDuration="5.10178197s" podCreationTimestamp="2026-01-30 05:27:57 +0000 UTC" firstStartedPulling="2026-01-30 05:27:58.485631014 +0000 UTC m=+1213.855541271" lastFinishedPulling="2026-01-30 05:28:01.341654111 +0000 UTC m=+1216.711564378" observedRunningTime="2026-01-30 05:28:02.096315868 +0000 UTC m=+1217.466226125" watchObservedRunningTime="2026-01-30 05:28:02.10178197 +0000 UTC m=+1217.471692227" Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.129263 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.954664902 podStartE2EDuration="5.129239312s" 
podCreationTimestamp="2026-01-30 05:27:57 +0000 UTC" firstStartedPulling="2026-01-30 05:27:58.166724482 +0000 UTC m=+1213.536634739" lastFinishedPulling="2026-01-30 05:28:01.341298892 +0000 UTC m=+1216.711209149" observedRunningTime="2026-01-30 05:28:02.119544341 +0000 UTC m=+1217.489454618" watchObservedRunningTime="2026-01-30 05:28:02.129239312 +0000 UTC m=+1217.499149569" Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.152584 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.114482617 podStartE2EDuration="5.152560627s" podCreationTimestamp="2026-01-30 05:27:57 +0000 UTC" firstStartedPulling="2026-01-30 05:27:58.304938947 +0000 UTC m=+1213.674849204" lastFinishedPulling="2026-01-30 05:28:01.343016937 +0000 UTC m=+1216.712927214" observedRunningTime="2026-01-30 05:28:02.138364659 +0000 UTC m=+1217.508274916" watchObservedRunningTime="2026-01-30 05:28:02.152560627 +0000 UTC m=+1217.522470884" Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.656868 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.835679 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.836044 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:28:02 crc kubenswrapper[4931]: I0130 05:28:02.861829 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 05:28:03 crc kubenswrapper[4931]: I0130 05:28:03.071987 4931 generic.go:334] "Generic (PLEG): container finished" podID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerID="7b8d1ff499fbb8ebcbcbf87555e175867c57e89396d957fe7b851fe96dc36245" exitCode=143 Jan 30 05:28:03 crc kubenswrapper[4931]: I0130 05:28:03.073094 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e37fcb81-9df1-411b-b593-8ca56c518f33","Type":"ContainerDied","Data":"7b8d1ff499fbb8ebcbcbf87555e175867c57e89396d957fe7b851fe96dc36245"} Jan 30 05:28:06 crc kubenswrapper[4931]: I0130 05:28:06.106077 4931 generic.go:334] "Generic (PLEG): container finished" podID="9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" containerID="6fe17572613dbd341b30bd762c1b5735b49c432308fe3f9a9ede6d5185282afe" exitCode=0 Jan 30 05:28:06 crc kubenswrapper[4931]: I0130 05:28:06.106172 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" event={"ID":"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f","Type":"ContainerDied","Data":"6fe17572613dbd341b30bd762c1b5735b49c432308fe3f9a9ede6d5185282afe"} Jan 30 05:28:06 crc kubenswrapper[4931]: I0130 05:28:06.112004 4931 generic.go:334] "Generic (PLEG): container finished" podID="b0a8f8fe-306a-4373-bbb0-d96f2b498d62" containerID="346e5462e41c54c8f5c2422490f080d2b64f85c405ea5cc5337aa66fee775153" exitCode=0 Jan 30 05:28:06 crc kubenswrapper[4931]: I0130 05:28:06.112062 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9b8l8" event={"ID":"b0a8f8fe-306a-4373-bbb0-d96f2b498d62","Type":"ContainerDied","Data":"346e5462e41c54c8f5c2422490f080d2b64f85c405ea5cc5337aa66fee775153"} Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.494915 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.605413 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.622622 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-combined-ca-bundle\") pod \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.622823 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-config-data\") pod \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.623090 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc9q6\" (UniqueName: \"kubernetes.io/projected/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-kube-api-access-dc9q6\") pod \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.623357 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-scripts\") pod \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\" (UID: \"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.630836 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-scripts" (OuterVolumeSpecName: "scripts") pod "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" (UID: "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.631377 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-kube-api-access-dc9q6" (OuterVolumeSpecName: "kube-api-access-dc9q6") pod "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" (UID: "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f"). InnerVolumeSpecName "kube-api-access-dc9q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.654838 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-config-data" (OuterVolumeSpecName: "config-data") pod "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" (UID: "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.684938 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" (UID: "9262fbc3-2503-4252-b2dd-10cd8dcfbd6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.727156 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-scripts\") pod \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.727362 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-combined-ca-bundle\") pod \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.727492 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf54v\" (UniqueName: \"kubernetes.io/projected/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-kube-api-access-sf54v\") pod \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.727613 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-config-data\") pod \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\" (UID: \"b0a8f8fe-306a-4373-bbb0-d96f2b498d62\") " Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.728337 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.728372 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-config-data\") on node \"crc\" DevicePath \"\"" Jan 
30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.728393 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc9q6\" (UniqueName: \"kubernetes.io/projected/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-kube-api-access-dc9q6\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.728413 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.729407 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-scripts" (OuterVolumeSpecName: "scripts") pod "b0a8f8fe-306a-4373-bbb0-d96f2b498d62" (UID: "b0a8f8fe-306a-4373-bbb0-d96f2b498d62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.730771 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-kube-api-access-sf54v" (OuterVolumeSpecName: "kube-api-access-sf54v") pod "b0a8f8fe-306a-4373-bbb0-d96f2b498d62" (UID: "b0a8f8fe-306a-4373-bbb0-d96f2b498d62"). InnerVolumeSpecName "kube-api-access-sf54v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.749765 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0a8f8fe-306a-4373-bbb0-d96f2b498d62" (UID: "b0a8f8fe-306a-4373-bbb0-d96f2b498d62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.752134 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-config-data" (OuterVolumeSpecName: "config-data") pod "b0a8f8fe-306a-4373-bbb0-d96f2b498d62" (UID: "b0a8f8fe-306a-4373-bbb0-d96f2b498d62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.811354 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.811442 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.831130 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.831206 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.831237 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf54v\" (UniqueName: \"kubernetes.io/projected/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-kube-api-access-sf54v\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.831264 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a8f8fe-306a-4373-bbb0-d96f2b498d62-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.861765 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.907259 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 05:28:07 crc kubenswrapper[4931]: I0130 05:28:07.999599 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.074355 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-vxxmk"] Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.074755 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" podUID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerName="dnsmasq-dns" containerID="cri-o://1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f" gracePeriod=10 Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.163977 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9b8l8" event={"ID":"b0a8f8fe-306a-4373-bbb0-d96f2b498d62","Type":"ContainerDied","Data":"831b8fc29d43f639d90655aa063689063afad41ee488b8e379f4edd22fec355a"} Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.164547 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="831b8fc29d43f639d90655aa063689063afad41ee488b8e379f4edd22fec355a" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.164653 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9b8l8" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.167736 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.170603 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jhn9j" event={"ID":"9262fbc3-2503-4252-b2dd-10cd8dcfbd6f","Type":"ContainerDied","Data":"05fd8e29ff186451c0c9db4ef3d2f2174b59837370d11120981136e0fa9ba630"} Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.170641 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05fd8e29ff186451c0c9db4ef3d2f2174b59837370d11120981136e0fa9ba630" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.240777 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.247897 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:28:08 crc kubenswrapper[4931]: E0130 05:28:08.248377 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" containerName="nova-cell1-conductor-db-sync" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.248402 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" containerName="nova-cell1-conductor-db-sync" Jan 30 05:28:08 crc kubenswrapper[4931]: E0130 05:28:08.248498 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a8f8fe-306a-4373-bbb0-d96f2b498d62" containerName="nova-manage" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.248509 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a8f8fe-306a-4373-bbb0-d96f2b498d62" containerName="nova-manage" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.248787 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a8f8fe-306a-4373-bbb0-d96f2b498d62" containerName="nova-manage" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 
05:28:08.248828 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" containerName="nova-cell1-conductor-db-sync" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.249561 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.252251 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.279062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.341762 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.341925 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.342029 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hmc\" (UniqueName: \"kubernetes.io/projected/9bb44c01-e79f-42d8-912c-66db07c6b328-kube-api-access-67hmc\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.444990 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.461849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67hmc\" (UniqueName: \"kubernetes.io/projected/9bb44c01-e79f-42d8-912c-66db07c6b328-kube-api-access-67hmc\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.461911 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.458041 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.465299 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.478676 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.479015 4931 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-log" containerID="cri-o://bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c" gracePeriod=30 Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.479686 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-api" containerID="cri-o://0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386" gracePeriod=30 Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.485793 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": EOF" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.494474 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hmc\" (UniqueName: \"kubernetes.io/projected/9bb44c01-e79f-42d8-912c-66db07c6b328-kube-api-access-67hmc\") pod \"nova-cell1-conductor-0\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.506671 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": EOF" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.586592 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.669663 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.744291 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.766404 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-sb\") pod \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.766526 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-config\") pod \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.766771 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-svc\") pod \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.767022 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-682nm\" (UniqueName: \"kubernetes.io/projected/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-kube-api-access-682nm\") pod \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.767083 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-nb\") pod \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " Jan 30 05:28:08 crc 
kubenswrapper[4931]: I0130 05:28:08.767107 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-swift-storage-0\") pod \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\" (UID: \"98fe74d3-fa52-4814-8497-1a9bb9ea72ed\") " Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.776620 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-kube-api-access-682nm" (OuterVolumeSpecName: "kube-api-access-682nm") pod "98fe74d3-fa52-4814-8497-1a9bb9ea72ed" (UID: "98fe74d3-fa52-4814-8497-1a9bb9ea72ed"). InnerVolumeSpecName "kube-api-access-682nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.832359 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98fe74d3-fa52-4814-8497-1a9bb9ea72ed" (UID: "98fe74d3-fa52-4814-8497-1a9bb9ea72ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.846504 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98fe74d3-fa52-4814-8497-1a9bb9ea72ed" (UID: "98fe74d3-fa52-4814-8497-1a9bb9ea72ed"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.852192 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98fe74d3-fa52-4814-8497-1a9bb9ea72ed" (UID: "98fe74d3-fa52-4814-8497-1a9bb9ea72ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.856539 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98fe74d3-fa52-4814-8497-1a9bb9ea72ed" (UID: "98fe74d3-fa52-4814-8497-1a9bb9ea72ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.856926 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-config" (OuterVolumeSpecName: "config") pod "98fe74d3-fa52-4814-8497-1a9bb9ea72ed" (UID: "98fe74d3-fa52-4814-8497-1a9bb9ea72ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.869391 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.869413 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-682nm\" (UniqueName: \"kubernetes.io/projected/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-kube-api-access-682nm\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.869463 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.869472 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.869500 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:08 crc kubenswrapper[4931]: I0130 05:28:08.869511 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98fe74d3-fa52-4814-8497-1a9bb9ea72ed-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.094555 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:28:09 crc kubenswrapper[4931]: W0130 05:28:09.099654 4931 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb44c01_e79f_42d8_912c_66db07c6b328.slice/crio-1c009cefddaacdb91464295ae716d32cc8f92887e0517a9011cacb821ea578bd WatchSource:0}: Error finding container 1c009cefddaacdb91464295ae716d32cc8f92887e0517a9011cacb821ea578bd: Status 404 returned error can't find the container with id 1c009cefddaacdb91464295ae716d32cc8f92887e0517a9011cacb821ea578bd Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.183567 4931 generic.go:334] "Generic (PLEG): container finished" podID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerID="1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f" exitCode=0 Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.183633 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" event={"ID":"98fe74d3-fa52-4814-8497-1a9bb9ea72ed","Type":"ContainerDied","Data":"1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f"} Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.183676 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" event={"ID":"98fe74d3-fa52-4814-8497-1a9bb9ea72ed","Type":"ContainerDied","Data":"e1641f306bf07b5142c6dd94dd4d7be821af4a934007d916dd0dd69749c5f578"} Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.183697 4931 scope.go:117] "RemoveContainer" containerID="1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.183604 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-vxxmk" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.185048 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bb44c01-e79f-42d8-912c-66db07c6b328","Type":"ContainerStarted","Data":"1c009cefddaacdb91464295ae716d32cc8f92887e0517a9011cacb821ea578bd"} Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.188273 4931 generic.go:334] "Generic (PLEG): container finished" podID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerID="bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c" exitCode=143 Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.188481 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f76c52b2-cfad-4017-a265-142c8e1b54f9","Type":"ContainerDied","Data":"bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c"} Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.222164 4931 scope.go:117] "RemoveContainer" containerID="096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.238937 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-vxxmk"] Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.259316 4931 scope.go:117] "RemoveContainer" containerID="1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f" Jan 30 05:28:09 crc kubenswrapper[4931]: E0130 05:28:09.259955 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f\": container with ID starting with 1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f not found: ID does not exist" containerID="1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.260032 4931 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f"} err="failed to get container status \"1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f\": rpc error: code = NotFound desc = could not find container \"1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f\": container with ID starting with 1b840f2239f8f8dd8ec7c99cc9ea368efa35cb6a2c53f91934fe0a2a26cfb59f not found: ID does not exist" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.260065 4931 scope.go:117] "RemoveContainer" containerID="096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd" Jan 30 05:28:09 crc kubenswrapper[4931]: E0130 05:28:09.260414 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd\": container with ID starting with 096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd not found: ID does not exist" containerID="096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.260445 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd"} err="failed to get container status \"096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd\": rpc error: code = NotFound desc = could not find container \"096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd\": container with ID starting with 096e49c6163153a9fa1821f983bb1d1d489d24729a6de12e0a2cbc9a181c87cd not found: ID does not exist" Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.279861 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-vxxmk"] Jan 30 05:28:09 crc kubenswrapper[4931]: I0130 05:28:09.434373 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" path="/var/lib/kubelet/pods/98fe74d3-fa52-4814-8497-1a9bb9ea72ed/volumes" Jan 30 05:28:10 crc kubenswrapper[4931]: I0130 05:28:10.197604 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bb44c01-e79f-42d8-912c-66db07c6b328","Type":"ContainerStarted","Data":"9aac5eb9a7735bf1efc6d134170c4297691e32c19fa1a2cd01ab0ae918243436"} Jan 30 05:28:10 crc kubenswrapper[4931]: I0130 05:28:10.198000 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:10 crc kubenswrapper[4931]: I0130 05:28:10.199239 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="018cf21e-c9c2-4ab4-8794-f6e066bafc86" containerName="nova-scheduler-scheduler" containerID="cri-o://56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16" gracePeriod=30 Jan 30 05:28:10 crc kubenswrapper[4931]: I0130 05:28:10.215058 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.2150401730000002 podStartE2EDuration="2.215040173s" podCreationTimestamp="2026-01-30 05:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:10.213943615 +0000 UTC m=+1225.583853872" watchObservedRunningTime="2026-01-30 05:28:10.215040173 +0000 UTC m=+1225.584950430" Jan 30 05:28:12 crc kubenswrapper[4931]: E0130 05:28:12.874464 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 
05:28:12 crc kubenswrapper[4931]: E0130 05:28:12.881061 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 30 05:28:12 crc kubenswrapper[4931]: E0130 05:28:12.884152 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 30 05:28:12 crc kubenswrapper[4931]: E0130 05:28:12.884234 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="018cf21e-c9c2-4ab4-8794-f6e066bafc86" containerName="nova-scheduler-scheduler"
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.238129 4931 generic.go:334] "Generic (PLEG): container finished" podID="018cf21e-c9c2-4ab4-8794-f6e066bafc86" containerID="56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16" exitCode=0
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.238184 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"018cf21e-c9c2-4ab4-8794-f6e066bafc86","Type":"ContainerDied","Data":"56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16"}
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.766792 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.811077 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24xlx\" (UniqueName: \"kubernetes.io/projected/018cf21e-c9c2-4ab4-8794-f6e066bafc86-kube-api-access-24xlx\") pod \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") "
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.811278 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-config-data\") pod \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") "
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.811382 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-combined-ca-bundle\") pod \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\" (UID: \"018cf21e-c9c2-4ab4-8794-f6e066bafc86\") "
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.818088 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018cf21e-c9c2-4ab4-8794-f6e066bafc86-kube-api-access-24xlx" (OuterVolumeSpecName: "kube-api-access-24xlx") pod "018cf21e-c9c2-4ab4-8794-f6e066bafc86" (UID: "018cf21e-c9c2-4ab4-8794-f6e066bafc86"). InnerVolumeSpecName "kube-api-access-24xlx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.849806 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "018cf21e-c9c2-4ab4-8794-f6e066bafc86" (UID: "018cf21e-c9c2-4ab4-8794-f6e066bafc86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.866816 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-config-data" (OuterVolumeSpecName: "config-data") pod "018cf21e-c9c2-4ab4-8794-f6e066bafc86" (UID: "018cf21e-c9c2-4ab4-8794-f6e066bafc86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.913824 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24xlx\" (UniqueName: \"kubernetes.io/projected/018cf21e-c9c2-4ab4-8794-f6e066bafc86-kube-api-access-24xlx\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.913859 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:13 crc kubenswrapper[4931]: I0130 05:28:13.913870 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018cf21e-c9c2-4ab4-8794-f6e066bafc86-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.184006 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.248062 4931 generic.go:334] "Generic (PLEG): container finished" podID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerID="0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386" exitCode=0
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.248146 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f76c52b2-cfad-4017-a265-142c8e1b54f9","Type":"ContainerDied","Data":"0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386"}
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.248178 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f76c52b2-cfad-4017-a265-142c8e1b54f9","Type":"ContainerDied","Data":"ce62ba9a7d987c86fe58643c3364d299b44cd5fdb6e5e34e5b8f029af4afb80d"}
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.248200 4931 scope.go:117] "RemoveContainer" containerID="0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.248343 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.250883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"018cf21e-c9c2-4ab4-8794-f6e066bafc86","Type":"ContainerDied","Data":"f850561dfba344195d6aaf76e50b967d79d23fca8dbd9d77fa357655bcad14dc"}
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.251000 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.291029 4931 scope.go:117] "RemoveContainer" containerID="bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.305779 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.319974 4931 scope.go:117] "RemoveContainer" containerID="0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.320426 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9czc\" (UniqueName: \"kubernetes.io/projected/f76c52b2-cfad-4017-a265-142c8e1b54f9-kube-api-access-z9czc\") pod \"f76c52b2-cfad-4017-a265-142c8e1b54f9\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") "
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.320631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f76c52b2-cfad-4017-a265-142c8e1b54f9-logs\") pod \"f76c52b2-cfad-4017-a265-142c8e1b54f9\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") "
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.320719 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-combined-ca-bundle\") pod \"f76c52b2-cfad-4017-a265-142c8e1b54f9\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") "
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.320874 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-config-data\") pod \"f76c52b2-cfad-4017-a265-142c8e1b54f9\" (UID: \"f76c52b2-cfad-4017-a265-142c8e1b54f9\") "
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.321133 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76c52b2-cfad-4017-a265-142c8e1b54f9-logs" (OuterVolumeSpecName: "logs") pod "f76c52b2-cfad-4017-a265-142c8e1b54f9" (UID: "f76c52b2-cfad-4017-a265-142c8e1b54f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:28:14 crc kubenswrapper[4931]: E0130 05:28:14.320706 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386\": container with ID starting with 0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386 not found: ID does not exist" containerID="0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.321189 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386"} err="failed to get container status \"0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386\": rpc error: code = NotFound desc = could not find container \"0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386\": container with ID starting with 0191ef656c7021ce17be43d4051ed5a029ba9b9d53bf9beffcc79d9698624386 not found: ID does not exist"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.321218 4931 scope.go:117] "RemoveContainer" containerID="bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.321460 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f76c52b2-cfad-4017-a265-142c8e1b54f9-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.324942 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: E0130 05:28:14.326939 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c\": container with ID starting with bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c not found: ID does not exist" containerID="bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.326986 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c"} err="failed to get container status \"bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c\": rpc error: code = NotFound desc = could not find container \"bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c\": container with ID starting with bd57739faa4ffabdbe2aaad279c6333f7323539495e281e3c68aae4e3d40e10c not found: ID does not exist"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.327013 4931 scope.go:117] "RemoveContainer" containerID="56fd7ff0bc2b09ba7471742c19169b6e1897a3dc1f6e923f43d7c3edb6934d16"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.327470 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76c52b2-cfad-4017-a265-142c8e1b54f9-kube-api-access-z9czc" (OuterVolumeSpecName: "kube-api-access-z9czc") pod "f76c52b2-cfad-4017-a265-142c8e1b54f9" (UID: "f76c52b2-cfad-4017-a265-142c8e1b54f9"). InnerVolumeSpecName "kube-api-access-z9czc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.333829 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: E0130 05:28:14.334264 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerName="dnsmasq-dns"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334281 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerName="dnsmasq-dns"
Jan 30 05:28:14 crc kubenswrapper[4931]: E0130 05:28:14.334297 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018cf21e-c9c2-4ab4-8794-f6e066bafc86" containerName="nova-scheduler-scheduler"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334304 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="018cf21e-c9c2-4ab4-8794-f6e066bafc86" containerName="nova-scheduler-scheduler"
Jan 30 05:28:14 crc kubenswrapper[4931]: E0130 05:28:14.334322 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-api"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334329 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-api"
Jan 30 05:28:14 crc kubenswrapper[4931]: E0130 05:28:14.334343 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-log"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334349 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-log"
Jan 30 05:28:14 crc kubenswrapper[4931]: E0130 05:28:14.334361 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerName="init"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334367 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerName="init"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334555 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fe74d3-fa52-4814-8497-1a9bb9ea72ed" containerName="dnsmasq-dns"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334566 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-log"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334573 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="018cf21e-c9c2-4ab4-8794-f6e066bafc86" containerName="nova-scheduler-scheduler"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.334587 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" containerName="nova-api-api"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.335186 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.338422 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.341901 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.362005 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-config-data" (OuterVolumeSpecName: "config-data") pod "f76c52b2-cfad-4017-a265-142c8e1b54f9" (UID: "f76c52b2-cfad-4017-a265-142c8e1b54f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.376271 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f76c52b2-cfad-4017-a265-142c8e1b54f9" (UID: "f76c52b2-cfad-4017-a265-142c8e1b54f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.423167 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.423263 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-config-data\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.423372 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8p8\" (UniqueName: \"kubernetes.io/projected/f4be9b51-9e05-4080-9aac-1e7a68785e90-kube-api-access-dr8p8\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.423513 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.423538 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76c52b2-cfad-4017-a265-142c8e1b54f9-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.423549 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9czc\" (UniqueName: \"kubernetes.io/projected/f76c52b2-cfad-4017-a265-142c8e1b54f9-kube-api-access-z9czc\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.524946 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-config-data\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.525023 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8p8\" (UniqueName: \"kubernetes.io/projected/f4be9b51-9e05-4080-9aac-1e7a68785e90-kube-api-access-dr8p8\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.525096 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.529484 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.532540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-config-data\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.544045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8p8\" (UniqueName: \"kubernetes.io/projected/f4be9b51-9e05-4080-9aac-1e7a68785e90-kube-api-access-dr8p8\") pod \"nova-scheduler-0\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.660050 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.678156 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.689415 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.690848 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.696971 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.729792 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.731348 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.830486 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-config-data\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.830570 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.830625 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78469f1d-85e0-488f-9334-f756e0410bba-logs\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.830650 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpvsb\" (UniqueName: \"kubernetes.io/projected/78469f1d-85e0-488f-9334-f756e0410bba-kube-api-access-tpvsb\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.932317 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-config-data\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.932392 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.932508 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78469f1d-85e0-488f-9334-f756e0410bba-logs\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.932527 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpvsb\" (UniqueName: \"kubernetes.io/projected/78469f1d-85e0-488f-9334-f756e0410bba-kube-api-access-tpvsb\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.933800 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78469f1d-85e0-488f-9334-f756e0410bba-logs\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.937551 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-config-data\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.940489 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:14 crc kubenswrapper[4931]: I0130 05:28:14.952167 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpvsb\" (UniqueName: \"kubernetes.io/projected/78469f1d-85e0-488f-9334-f756e0410bba-kube-api-access-tpvsb\") pod \"nova-api-0\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " pod="openstack/nova-api-0"
Jan 30 05:28:15 crc kubenswrapper[4931]: I0130 05:28:15.039243 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:15 crc kubenswrapper[4931]: I0130 05:28:15.217660 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:15 crc kubenswrapper[4931]: W0130 05:28:15.243685 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4be9b51_9e05_4080_9aac_1e7a68785e90.slice/crio-2dd5974fb15bfc06c0c4a9379a4055028a1a3284a814cf069766b15f03c9fc5e WatchSource:0}: Error finding container 2dd5974fb15bfc06c0c4a9379a4055028a1a3284a814cf069766b15f03c9fc5e: Status 404 returned error can't find the container with id 2dd5974fb15bfc06c0c4a9379a4055028a1a3284a814cf069766b15f03c9fc5e
Jan 30 05:28:15 crc kubenswrapper[4931]: I0130 05:28:15.265382 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4be9b51-9e05-4080-9aac-1e7a68785e90","Type":"ContainerStarted","Data":"2dd5974fb15bfc06c0c4a9379a4055028a1a3284a814cf069766b15f03c9fc5e"}
Jan 30 05:28:15 crc kubenswrapper[4931]: I0130 05:28:15.431926 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018cf21e-c9c2-4ab4-8794-f6e066bafc86" path="/var/lib/kubelet/pods/018cf21e-c9c2-4ab4-8794-f6e066bafc86/volumes"
Jan 30 05:28:15 crc kubenswrapper[4931]: I0130 05:28:15.432760 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76c52b2-cfad-4017-a265-142c8e1b54f9" path="/var/lib/kubelet/pods/f76c52b2-cfad-4017-a265-142c8e1b54f9/volumes"
Jan 30 05:28:15 crc kubenswrapper[4931]: I0130 05:28:15.537513 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:16 crc kubenswrapper[4931]: I0130 05:28:16.277896 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78469f1d-85e0-488f-9334-f756e0410bba","Type":"ContainerStarted","Data":"e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244"}
Jan 30 05:28:16 crc kubenswrapper[4931]: I0130 05:28:16.278221 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78469f1d-85e0-488f-9334-f756e0410bba","Type":"ContainerStarted","Data":"a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa"}
Jan 30 05:28:16 crc kubenswrapper[4931]: I0130 05:28:16.278234 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78469f1d-85e0-488f-9334-f756e0410bba","Type":"ContainerStarted","Data":"4f8866ca081593d16859461ac5678f090e967b3446220506bb4f1ab22cb7f8fb"}
Jan 30 05:28:16 crc kubenswrapper[4931]: I0130 05:28:16.279457 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4be9b51-9e05-4080-9aac-1e7a68785e90","Type":"ContainerStarted","Data":"fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd"}
Jan 30 05:28:16 crc kubenswrapper[4931]: I0130 05:28:16.302310 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.302293319 podStartE2EDuration="2.302293319s" podCreationTimestamp="2026-01-30 05:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:16.300804617 +0000 UTC m=+1231.670714874" watchObservedRunningTime="2026-01-30 05:28:16.302293319 +0000 UTC m=+1231.672203576"
Jan 30 05:28:16 crc kubenswrapper[4931]: I0130 05:28:16.323082 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.323059182 podStartE2EDuration="2.323059182s" podCreationTimestamp="2026-01-30 05:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:16.32085785 +0000 UTC m=+1231.690768127" watchObservedRunningTime="2026-01-30 05:28:16.323059182 +0000 UTC m=+1231.692969449"
Jan 30 05:28:18 crc kubenswrapper[4931]: I0130 05:28:18.642902 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:19 crc kubenswrapper[4931]: I0130 05:28:19.732248 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 30 05:28:24 crc kubenswrapper[4931]: I0130 05:28:24.732150 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 30 05:28:24 crc kubenswrapper[4931]: I0130 05:28:24.782452 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 30 05:28:25 crc kubenswrapper[4931]: I0130 05:28:25.040170 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 05:28:25 crc kubenswrapper[4931]: I0130 05:28:25.040281 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 05:28:25 crc kubenswrapper[4931]: I0130 05:28:25.439344 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 30 05:28:26 crc kubenswrapper[4931]: I0130 05:28:26.122887 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:28:26 crc kubenswrapper[4931]: I0130 05:28:26.123240 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.456155 4931 generic.go:334] "Generic (PLEG): container finished" podID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerID="5894edfe624e331ed6304798892205879e56e8c005b21dd019bad175228231c6" exitCode=137
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.456187 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e37fcb81-9df1-411b-b593-8ca56c518f33","Type":"ContainerDied","Data":"5894edfe624e331ed6304798892205879e56e8c005b21dd019bad175228231c6"}
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.459565 4931 generic.go:334] "Generic (PLEG): container finished" podID="f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" containerID="8389df76ed09df3262a31c554555764b5a96d65170cca0da5f3f94ed26654194" exitCode=137
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.459605 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3","Type":"ContainerDied","Data":"8389df76ed09df3262a31c554555764b5a96d65170cca0da5f3f94ed26654194"}
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.616763 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.624710 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.797989 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-config-data\") pod \"e37fcb81-9df1-411b-b593-8ca56c518f33\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") "
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.798150 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m84h6\" (UniqueName: \"kubernetes.io/projected/e37fcb81-9df1-411b-b593-8ca56c518f33-kube-api-access-m84h6\") pod \"e37fcb81-9df1-411b-b593-8ca56c518f33\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") "
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.798245 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e37fcb81-9df1-411b-b593-8ca56c518f33-logs\") pod \"e37fcb81-9df1-411b-b593-8ca56c518f33\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") "
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.798302 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-config-data\") pod \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") "
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.798373 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-combined-ca-bundle\") pod \"e37fcb81-9df1-411b-b593-8ca56c518f33\" (UID: \"e37fcb81-9df1-411b-b593-8ca56c518f33\") "
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.798405 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-combined-ca-bundle\") pod \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") "
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.798500 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwsz9\" (UniqueName: \"kubernetes.io/projected/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-kube-api-access-mwsz9\") pod \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\" (UID: \"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3\") "
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.798904 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e37fcb81-9df1-411b-b593-8ca56c518f33-logs" (OuterVolumeSpecName: "logs") pod "e37fcb81-9df1-411b-b593-8ca56c518f33" (UID: "e37fcb81-9df1-411b-b593-8ca56c518f33"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.799578 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e37fcb81-9df1-411b-b593-8ca56c518f33-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.810382 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37fcb81-9df1-411b-b593-8ca56c518f33-kube-api-access-m84h6" (OuterVolumeSpecName: "kube-api-access-m84h6") pod "e37fcb81-9df1-411b-b593-8ca56c518f33" (UID: "e37fcb81-9df1-411b-b593-8ca56c518f33"). InnerVolumeSpecName "kube-api-access-m84h6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.810827 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-kube-api-access-mwsz9" (OuterVolumeSpecName: "kube-api-access-mwsz9") pod "f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" (UID: "f06a8661-ec14-48e2-a48b-2ecfec7b8ea3"). InnerVolumeSpecName "kube-api-access-mwsz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.830230 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" (UID: "f06a8661-ec14-48e2-a48b-2ecfec7b8ea3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.841961 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-config-data" (OuterVolumeSpecName: "config-data") pod "f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" (UID: "f06a8661-ec14-48e2-a48b-2ecfec7b8ea3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.867416 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-config-data" (OuterVolumeSpecName: "config-data") pod "e37fcb81-9df1-411b-b593-8ca56c518f33" (UID: "e37fcb81-9df1-411b-b593-8ca56c518f33"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.869565 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e37fcb81-9df1-411b-b593-8ca56c518f33" (UID: "e37fcb81-9df1-411b-b593-8ca56c518f33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.901883 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.902096 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.902236 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwsz9\" (UniqueName: \"kubernetes.io/projected/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-kube-api-access-mwsz9\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.902389 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37fcb81-9df1-411b-b593-8ca56c518f33-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.902617 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m84h6\" (UniqueName: \"kubernetes.io/projected/e37fcb81-9df1-411b-b593-8ca56c518f33-kube-api-access-m84h6\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:32 crc kubenswrapper[4931]: I0130 05:28:32.902748 4931 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.478137 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e37fcb81-9df1-411b-b593-8ca56c518f33","Type":"ContainerDied","Data":"78bd5fff55edf34b6d1b969823b41d22ccbf3eabadd222063f311257dc45d17d"} Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.478220 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.478698 4931 scope.go:117] "RemoveContainer" containerID="5894edfe624e331ed6304798892205879e56e8c005b21dd019bad175228231c6" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.481976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f06a8661-ec14-48e2-a48b-2ecfec7b8ea3","Type":"ContainerDied","Data":"77d36bb8a11804c505d48bebee2dbafaeb19326f0f1ced3b73af355d57d86b2b"} Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.482037 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.518893 4931 scope.go:117] "RemoveContainer" containerID="7b8d1ff499fbb8ebcbcbf87555e175867c57e89396d957fe7b851fe96dc36245" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.537848 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.565484 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.573978 4931 scope.go:117] "RemoveContainer" containerID="8389df76ed09df3262a31c554555764b5a96d65170cca0da5f3f94ed26654194" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.576730 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.587589 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.597641 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: E0130 05:28:33.598139 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerName="nova-metadata-metadata" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.598165 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerName="nova-metadata-metadata" Jan 30 05:28:33 crc kubenswrapper[4931]: E0130 05:28:33.598194 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerName="nova-metadata-log" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.598208 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" 
containerName="nova-metadata-log" Jan 30 05:28:33 crc kubenswrapper[4931]: E0130 05:28:33.598232 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.598245 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.598646 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.598689 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerName="nova-metadata-log" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.598708 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" containerName="nova-metadata-metadata" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.599663 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.604708 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.606824 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.606832 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.608398 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.610116 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.612182 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.612187 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.616524 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.628227 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719038 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" 
Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719123 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719309 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719570 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719622 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhbcb\" (UniqueName: \"kubernetes.io/projected/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-kube-api-access-fhbcb\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719713 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvzsw\" (UniqueName: \"kubernetes.io/projected/2565fa42-f180-4948-8b2f-68c419d78d2b-kube-api-access-qvzsw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" 
Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719781 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-config-data\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.719948 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-logs\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.720011 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.720230 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.822234 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.822707 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.823051 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.823261 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.823475 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhbcb\" (UniqueName: \"kubernetes.io/projected/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-kube-api-access-fhbcb\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.823711 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.823976 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qvzsw\" (UniqueName: \"kubernetes.io/projected/2565fa42-f180-4948-8b2f-68c419d78d2b-kube-api-access-qvzsw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.824128 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-config-data\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.824308 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-logs\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.824453 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.824974 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-logs\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.830032 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.830483 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-config-data\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.831750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.832943 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.832998 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.835228 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.835840 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.853121 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvzsw\" (UniqueName: \"kubernetes.io/projected/2565fa42-f180-4948-8b2f-68c419d78d2b-kube-api-access-qvzsw\") pod \"nova-cell1-novncproxy-0\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.853339 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhbcb\" (UniqueName: \"kubernetes.io/projected/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-kube-api-access-fhbcb\") pod \"nova-metadata-0\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " pod="openstack/nova-metadata-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.958973 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:33 crc kubenswrapper[4931]: I0130 05:28:33.964652 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:34 crc kubenswrapper[4931]: I0130 05:28:34.304415 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:34 crc kubenswrapper[4931]: I0130 05:28:34.495915 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2565fa42-f180-4948-8b2f-68c419d78d2b","Type":"ContainerStarted","Data":"182ca03d45434848993e7087501801fbb8335a526ad9960a6da96e395124bc68"} Jan 30 05:28:34 crc kubenswrapper[4931]: I0130 05:28:34.591255 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.045623 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.046045 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.047091 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.051858 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.444271 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37fcb81-9df1-411b-b593-8ca56c518f33" path="/var/lib/kubelet/pods/e37fcb81-9df1-411b-b593-8ca56c518f33/volumes" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.447618 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f06a8661-ec14-48e2-a48b-2ecfec7b8ea3" path="/var/lib/kubelet/pods/f06a8661-ec14-48e2-a48b-2ecfec7b8ea3/volumes" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.516757 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873","Type":"ContainerStarted","Data":"c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322"} Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.516818 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873","Type":"ContainerStarted","Data":"df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e"} Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.516840 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873","Type":"ContainerStarted","Data":"ce7e3daf9b312cef06426fb53814cad3b92811315fdfd1796c0688baab6a72ed"} Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.523132 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2565fa42-f180-4948-8b2f-68c419d78d2b","Type":"ContainerStarted","Data":"4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe"} Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.523199 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.527917 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.561221 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.561199843 podStartE2EDuration="2.561199843s" podCreationTimestamp="2026-01-30 05:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:35.543507797 +0000 UTC m=+1250.913418104" watchObservedRunningTime="2026-01-30 05:28:35.561199843 +0000 UTC m=+1250.931110110" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 
05:28:35.616312 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.616291889 podStartE2EDuration="2.616291889s" podCreationTimestamp="2026-01-30 05:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:35.596980968 +0000 UTC m=+1250.966891255" watchObservedRunningTime="2026-01-30 05:28:35.616291889 +0000 UTC m=+1250.986202156" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.731347 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-ctzjd"] Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.733326 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.762957 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-ctzjd"] Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.886164 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.886269 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.886306 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.886565 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-config\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.886687 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-svc\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.886959 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnld\" (UniqueName: \"kubernetes.io/projected/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-kube-api-access-nrnld\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.988672 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.988723 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.988755 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-config\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.988801 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-svc\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.989017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnld\" (UniqueName: \"kubernetes.io/projected/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-kube-api-access-nrnld\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.989049 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.990440 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-config\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.990597 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.990721 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.990837 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-svc\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:35 crc kubenswrapper[4931]: I0130 05:28:35.990950 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:36 crc kubenswrapper[4931]: I0130 05:28:36.014066 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnld\" (UniqueName: \"kubernetes.io/projected/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-kube-api-access-nrnld\") pod 
\"dnsmasq-dns-5ddd577785-ctzjd\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:36 crc kubenswrapper[4931]: I0130 05:28:36.076767 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:36 crc kubenswrapper[4931]: I0130 05:28:36.617371 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-ctzjd"] Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.495548 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.496207 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-central-agent" containerID="cri-o://782c5f8dbe0e7576669ed328ec36323e5d152ca2c37c77db802604122975e914" gracePeriod=30 Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.496775 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="proxy-httpd" containerID="cri-o://149c802638d11431261e2009e655ac397a1354f084b06d7f2da2c77118f48d07" gracePeriod=30 Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.496850 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="sg-core" containerID="cri-o://37cec202033bdc3e70a415b285410f6ce8158b9541b9d20f36bd938b5978559c" gracePeriod=30 Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.496971 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-notification-agent" 
containerID="cri-o://89d3e2e5267fbad75d566e6ba9ac104cbc8326782fb62f0bd4ed8c4f9b169c76" gracePeriod=30 Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.539008 4931 generic.go:334] "Generic (PLEG): container finished" podID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerID="d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2" exitCode=0 Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.540059 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" event={"ID":"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d","Type":"ContainerDied","Data":"d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2"} Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.540089 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" event={"ID":"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d","Type":"ContainerStarted","Data":"645723e127490c600cf593cc161f0207c0a197195fa54096da51b7634ddd33ac"} Jan 30 05:28:37 crc kubenswrapper[4931]: I0130 05:28:37.816273 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.564450 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" event={"ID":"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d","Type":"ContainerStarted","Data":"e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e"} Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.564573 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.569755 4931 generic.go:334] "Generic (PLEG): container finished" podID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerID="149c802638d11431261e2009e655ac397a1354f084b06d7f2da2c77118f48d07" exitCode=0 Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.569787 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerID="37cec202033bdc3e70a415b285410f6ce8158b9541b9d20f36bd938b5978559c" exitCode=2 Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.569800 4931 generic.go:334] "Generic (PLEG): container finished" podID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerID="782c5f8dbe0e7576669ed328ec36323e5d152ca2c37c77db802604122975e914" exitCode=0 Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.569974 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-log" containerID="cri-o://a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa" gracePeriod=30 Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.570227 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerDied","Data":"149c802638d11431261e2009e655ac397a1354f084b06d7f2da2c77118f48d07"} Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.570266 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerDied","Data":"37cec202033bdc3e70a415b285410f6ce8158b9541b9d20f36bd938b5978559c"} Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.570282 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerDied","Data":"782c5f8dbe0e7576669ed328ec36323e5d152ca2c37c77db802604122975e914"} Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.570328 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-api" containerID="cri-o://e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244" gracePeriod=30 Jan 30 05:28:38 crc 
kubenswrapper[4931]: I0130 05:28:38.613246 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" podStartSLOduration=3.6132185039999998 podStartE2EDuration="3.613218504s" podCreationTimestamp="2026-01-30 05:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:38.601103884 +0000 UTC m=+1253.971014141" watchObservedRunningTime="2026-01-30 05:28:38.613218504 +0000 UTC m=+1253.983128791" Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.959833 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.965746 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:28:38 crc kubenswrapper[4931]: I0130 05:28:38.965820 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:28:39 crc kubenswrapper[4931]: I0130 05:28:39.582615 4931 generic.go:334] "Generic (PLEG): container finished" podID="78469f1d-85e0-488f-9334-f756e0410bba" containerID="a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa" exitCode=143 Jan 30 05:28:39 crc kubenswrapper[4931]: I0130 05:28:39.582678 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78469f1d-85e0-488f-9334-f756e0410bba","Type":"ContainerDied","Data":"a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa"} Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.165010 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.322789 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78469f1d-85e0-488f-9334-f756e0410bba-logs\") pod \"78469f1d-85e0-488f-9334-f756e0410bba\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.323671 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78469f1d-85e0-488f-9334-f756e0410bba-logs" (OuterVolumeSpecName: "logs") pod "78469f1d-85e0-488f-9334-f756e0410bba" (UID: "78469f1d-85e0-488f-9334-f756e0410bba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.323995 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-combined-ca-bundle\") pod \"78469f1d-85e0-488f-9334-f756e0410bba\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.324857 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-config-data\") pod \"78469f1d-85e0-488f-9334-f756e0410bba\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.324928 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpvsb\" (UniqueName: \"kubernetes.io/projected/78469f1d-85e0-488f-9334-f756e0410bba-kube-api-access-tpvsb\") pod \"78469f1d-85e0-488f-9334-f756e0410bba\" (UID: \"78469f1d-85e0-488f-9334-f756e0410bba\") " Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.325882 4931 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/78469f1d-85e0-488f-9334-f756e0410bba-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.329894 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78469f1d-85e0-488f-9334-f756e0410bba-kube-api-access-tpvsb" (OuterVolumeSpecName: "kube-api-access-tpvsb") pod "78469f1d-85e0-488f-9334-f756e0410bba" (UID: "78469f1d-85e0-488f-9334-f756e0410bba"). InnerVolumeSpecName "kube-api-access-tpvsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.367205 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-config-data" (OuterVolumeSpecName: "config-data") pod "78469f1d-85e0-488f-9334-f756e0410bba" (UID: "78469f1d-85e0-488f-9334-f756e0410bba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.370615 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78469f1d-85e0-488f-9334-f756e0410bba" (UID: "78469f1d-85e0-488f-9334-f756e0410bba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.427687 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.427727 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78469f1d-85e0-488f-9334-f756e0410bba-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.427740 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpvsb\" (UniqueName: \"kubernetes.io/projected/78469f1d-85e0-488f-9334-f756e0410bba-kube-api-access-tpvsb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.632110 4931 generic.go:334] "Generic (PLEG): container finished" podID="78469f1d-85e0-488f-9334-f756e0410bba" containerID="e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244" exitCode=0 Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.632149 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.632171 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78469f1d-85e0-488f-9334-f756e0410bba","Type":"ContainerDied","Data":"e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244"} Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.632620 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78469f1d-85e0-488f-9334-f756e0410bba","Type":"ContainerDied","Data":"4f8866ca081593d16859461ac5678f090e967b3446220506bb4f1ab22cb7f8fb"} Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.632642 4931 scope.go:117] "RemoveContainer" containerID="e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.670103 4931 scope.go:117] "RemoveContainer" containerID="a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.692194 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.695501 4931 scope.go:117] "RemoveContainer" containerID="e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244" Jan 30 05:28:42 crc kubenswrapper[4931]: E0130 05:28:42.696083 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244\": container with ID starting with e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244 not found: ID does not exist" containerID="e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.696131 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244"} err="failed to get container status \"e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244\": rpc error: code = NotFound desc = could not find container \"e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244\": container with ID starting with e65906c2e18f99d1ae8f7979b57948a0712e252ccec84428baf7905f59be9244 not found: ID does not exist" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.696172 4931 scope.go:117] "RemoveContainer" containerID="a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa" Jan 30 05:28:42 crc kubenswrapper[4931]: E0130 05:28:42.696627 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa\": container with ID starting with a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa not found: ID does not exist" containerID="a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.696660 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa"} err="failed to get container status \"a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa\": rpc error: code = NotFound desc = could not find container \"a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa\": container with ID starting with a86dbb9777f83307aa7e0153e43a1930d04e2a1808d8f994eb6f9cfbc049e3fa not found: ID does not exist" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.705821 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.714334 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 
05:28:42 crc kubenswrapper[4931]: E0130 05:28:42.714822 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-api" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.714847 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-api" Jan 30 05:28:42 crc kubenswrapper[4931]: E0130 05:28:42.714880 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-log" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.714889 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-log" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.715106 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-log" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.715134 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="78469f1d-85e0-488f-9334-f756e0410bba" containerName="nova-api-api" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.716290 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.719699 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.719743 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.719882 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.726917 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.833384 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-public-tls-certs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.833945 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrpv\" (UniqueName: \"kubernetes.io/projected/84172ea2-ea94-454e-a247-3388dbd3f559-kube-api-access-2vrpv\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.834060 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-config-data\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.834130 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.834246 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.834592 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84172ea2-ea94-454e-a247-3388dbd3f559-logs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.937925 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-public-tls-certs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.938016 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vrpv\" (UniqueName: \"kubernetes.io/projected/84172ea2-ea94-454e-a247-3388dbd3f559-kube-api-access-2vrpv\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.938055 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-config-data\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " 
pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.938083 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.938116 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.938178 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84172ea2-ea94-454e-a247-3388dbd3f559-logs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.938659 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84172ea2-ea94-454e-a247-3388dbd3f559-logs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.942025 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-config-data\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.943258 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.943870 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-public-tls-certs\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.950822 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:42 crc kubenswrapper[4931]: I0130 05:28:42.953365 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrpv\" (UniqueName: \"kubernetes.io/projected/84172ea2-ea94-454e-a247-3388dbd3f559-kube-api-access-2vrpv\") pod \"nova-api-0\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " pod="openstack/nova-api-0" Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.039600 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.438019 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78469f1d-85e0-488f-9334-f756e0410bba" path="/var/lib/kubelet/pods/78469f1d-85e0-488f-9334-f756e0410bba/volumes" Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.608643 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.645695 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84172ea2-ea94-454e-a247-3388dbd3f559","Type":"ContainerStarted","Data":"5def1c57849cb63c69d783f2aabd654a2c46a9078012cc3f016fb655404a7738"} Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.959670 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.965726 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.965779 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 05:28:43 crc kubenswrapper[4931]: I0130 05:28:43.986686 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.661672 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84172ea2-ea94-454e-a247-3388dbd3f559","Type":"ContainerStarted","Data":"32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61"} Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.662117 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"84172ea2-ea94-454e-a247-3388dbd3f559","Type":"ContainerStarted","Data":"d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8"} Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.701973 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.701942326 podStartE2EDuration="2.701942326s" podCreationTimestamp="2026-01-30 05:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:44.683328683 +0000 UTC m=+1260.053238970" watchObservedRunningTime="2026-01-30 05:28:44.701942326 +0000 UTC m=+1260.071852623" Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.707191 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.935374 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tjkcd"] Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.936809 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.942704 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.943371 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.944235 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tjkcd"]
Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.978560 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:28:44 crc kubenswrapper[4931]: I0130 05:28:44.978609 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.085519 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-config-data\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.085624 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.085716 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwjb\" (UniqueName: \"kubernetes.io/projected/fac7a7da-7577-4269-8e37-fd964be6f75c-kube-api-access-bqwjb\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.085745 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-scripts\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.187811 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwjb\" (UniqueName: \"kubernetes.io/projected/fac7a7da-7577-4269-8e37-fd964be6f75c-kube-api-access-bqwjb\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.187890 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-scripts\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.187997 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-config-data\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.188117 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.197408 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-config-data\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.199215 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-scripts\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.211047 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.215148 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwjb\" (UniqueName: \"kubernetes.io/projected/fac7a7da-7577-4269-8e37-fd964be6f75c-kube-api-access-bqwjb\") pod \"nova-cell1-cell-mapping-tjkcd\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.256611 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tjkcd"
Jan 30 05:28:45 crc kubenswrapper[4931]: I0130 05:28:45.715422 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tjkcd"]
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.078751 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd"
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.190839 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-p25hj"]
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.191898 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" podUID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerName="dnsmasq-dns" containerID="cri-o://a729151ede12640ea81c41f5d7f2d36efd861e7a4d31b991fe42dd4d2139fbe2" gracePeriod=10
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.687723 4931 generic.go:334] "Generic (PLEG): container finished" podID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerID="a729151ede12640ea81c41f5d7f2d36efd861e7a4d31b991fe42dd4d2139fbe2" exitCode=0
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.688020 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" event={"ID":"4dc14ae7-f05f-4093-838b-bdd419f4302f","Type":"ContainerDied","Data":"a729151ede12640ea81c41f5d7f2d36efd861e7a4d31b991fe42dd4d2139fbe2"}
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.688047 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj" event={"ID":"4dc14ae7-f05f-4093-838b-bdd419f4302f","Type":"ContainerDied","Data":"8d7831c7e52a58c64227ffa0e77bd37bd5e37f6fcdda5357867d37c004b57577"}
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.688056 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7831c7e52a58c64227ffa0e77bd37bd5e37f6fcdda5357867d37c004b57577"
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.691580 4931 generic.go:334] "Generic (PLEG): container finished" podID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerID="89d3e2e5267fbad75d566e6ba9ac104cbc8326782fb62f0bd4ed8c4f9b169c76" exitCode=0
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.691700 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerDied","Data":"89d3e2e5267fbad75d566e6ba9ac104cbc8326782fb62f0bd4ed8c4f9b169c76"}
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.694435 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tjkcd" event={"ID":"fac7a7da-7577-4269-8e37-fd964be6f75c","Type":"ContainerStarted","Data":"a32ca29963fb38a6014b8500b2c2495801c36c3f1563f62dcc7d71405aa5c328"}
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.694478 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tjkcd" event={"ID":"fac7a7da-7577-4269-8e37-fd964be6f75c","Type":"ContainerStarted","Data":"12bb0a37bfd07d7c0922ec928aebe0dc629e0ea396fd8488eb5f2642c0f2d238"}
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.696416 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj"
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.704468 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.714581 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tjkcd" podStartSLOduration=2.714564247 podStartE2EDuration="2.714564247s" podCreationTimestamp="2026-01-30 05:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:46.712261512 +0000 UTC m=+1262.082171769" watchObservedRunningTime="2026-01-30 05:28:46.714564247 +0000 UTC m=+1262.084474494"
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.830999 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-svc\") pod \"4dc14ae7-f05f-4093-838b-bdd419f4302f\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831158 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-log-httpd\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831190 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-sb\") pod \"4dc14ae7-f05f-4093-838b-bdd419f4302f\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831214 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-run-httpd\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831239 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-config\") pod \"4dc14ae7-f05f-4093-838b-bdd419f4302f\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831275 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-ceilometer-tls-certs\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831317 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xw72\" (UniqueName: \"kubernetes.io/projected/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-kube-api-access-9xw72\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831344 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-combined-ca-bundle\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831367 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-swift-storage-0\") pod \"4dc14ae7-f05f-4093-838b-bdd419f4302f\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831385 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-scripts\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831415 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-config-data\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831513 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-sg-core-conf-yaml\") pod \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\" (UID: \"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831529 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftvmx\" (UniqueName: \"kubernetes.io/projected/4dc14ae7-f05f-4093-838b-bdd419f4302f-kube-api-access-ftvmx\") pod \"4dc14ae7-f05f-4093-838b-bdd419f4302f\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.831556 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-nb\") pod \"4dc14ae7-f05f-4093-838b-bdd419f4302f\" (UID: \"4dc14ae7-f05f-4093-838b-bdd419f4302f\") "
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.832333 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.832884 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.836599 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-kube-api-access-9xw72" (OuterVolumeSpecName: "kube-api-access-9xw72") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "kube-api-access-9xw72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.838537 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc14ae7-f05f-4093-838b-bdd419f4302f-kube-api-access-ftvmx" (OuterVolumeSpecName: "kube-api-access-ftvmx") pod "4dc14ae7-f05f-4093-838b-bdd419f4302f" (UID: "4dc14ae7-f05f-4093-838b-bdd419f4302f"). InnerVolumeSpecName "kube-api-access-ftvmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.841783 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-scripts" (OuterVolumeSpecName: "scripts") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.910608 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.917807 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4dc14ae7-f05f-4093-838b-bdd419f4302f" (UID: "4dc14ae7-f05f-4093-838b-bdd419f4302f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.933514 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.933539 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.933549 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xw72\" (UniqueName: \"kubernetes.io/projected/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-kube-api-access-9xw72\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.933559 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.933567 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.933575 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.933584 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftvmx\" (UniqueName: \"kubernetes.io/projected/4dc14ae7-f05f-4093-838b-bdd419f4302f-kube-api-access-ftvmx\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.940802 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.942543 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4dc14ae7-f05f-4093-838b-bdd419f4302f" (UID: "4dc14ae7-f05f-4093-838b-bdd419f4302f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.948201 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-config" (OuterVolumeSpecName: "config") pod "4dc14ae7-f05f-4093-838b-bdd419f4302f" (UID: "4dc14ae7-f05f-4093-838b-bdd419f4302f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.950028 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4dc14ae7-f05f-4093-838b-bdd419f4302f" (UID: "4dc14ae7-f05f-4093-838b-bdd419f4302f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.968795 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.980819 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4dc14ae7-f05f-4093-838b-bdd419f4302f" (UID: "4dc14ae7-f05f-4093-838b-bdd419f4302f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:46 crc kubenswrapper[4931]: I0130 05:28:46.983698 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-config-data" (OuterVolumeSpecName: "config-data") pod "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" (UID: "9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.035793 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.035834 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.035848 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.035862 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.035874 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.035885 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc14ae7-f05f-4093-838b-bdd419f4302f-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.035896 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.711211 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7","Type":"ContainerDied","Data":"011e359b19d615fee43023e3c1e45b97d099d18b94cb63d33b5905aec3a68e27"}
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.711277 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-p25hj"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.711301 4931 scope.go:117] "RemoveContainer" containerID="149c802638d11431261e2009e655ac397a1354f084b06d7f2da2c77118f48d07"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.711605 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.760401 4931 scope.go:117] "RemoveContainer" containerID="37cec202033bdc3e70a415b285410f6ce8158b9541b9d20f36bd938b5978559c"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.784220 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-p25hj"]
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.795972 4931 scope.go:117] "RemoveContainer" containerID="89d3e2e5267fbad75d566e6ba9ac104cbc8326782fb62f0bd4ed8c4f9b169c76"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.802413 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-p25hj"]
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.820531 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.863036 4931 scope.go:117] "RemoveContainer" containerID="782c5f8dbe0e7576669ed328ec36323e5d152ca2c37c77db802604122975e914"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.873320 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.892386 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:28:47 crc kubenswrapper[4931]: E0130 05:28:47.892820 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerName="init"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.892844 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerName="init"
Jan 30 05:28:47 crc kubenswrapper[4931]: E0130 05:28:47.892860 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerName="dnsmasq-dns"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.892869 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerName="dnsmasq-dns"
Jan 30 05:28:47 crc kubenswrapper[4931]: E0130 05:28:47.892881 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-central-agent"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.892890 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-central-agent"
Jan 30 05:28:47 crc kubenswrapper[4931]: E0130 05:28:47.892907 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="sg-core"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.892914 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="sg-core"
Jan 30 05:28:47 crc kubenswrapper[4931]: E0130 05:28:47.892967 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="proxy-httpd"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.892977 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="proxy-httpd"
Jan 30 05:28:47 crc kubenswrapper[4931]: E0130 05:28:47.892997 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-notification-agent"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.893005 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-notification-agent"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.893306 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-notification-agent"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.893324 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="proxy-httpd"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.893344 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc14ae7-f05f-4093-838b-bdd419f4302f" containerName="dnsmasq-dns"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.893365 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="ceilometer-central-agent"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.893378 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" containerName="sg-core"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.895743 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.898653 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.898826 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.900440 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 05:28:47 crc kubenswrapper[4931]: I0130 05:28:47.905243 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.063736 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-scripts\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.063840 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-run-httpd\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.063878 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.063905 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.063938 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.063960 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-log-httpd\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.064144 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-config-data\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.064248 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4df4\" (UniqueName: \"kubernetes.io/projected/cacfcbd5-8c12-4fc5-88ce-516fda23464d-kube-api-access-q4df4\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.166697 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.167103 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.167297 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.167510 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-log-httpd\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.167804 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-config-data\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.168679 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4df4\" (UniqueName: \"kubernetes.io/projected/cacfcbd5-8c12-4fc5-88ce-516fda23464d-kube-api-access-q4df4\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.169028 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-scripts\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.169256 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-run-httpd\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.168065 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-log-httpd\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.169913 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-run-httpd\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.174816 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.176565 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-scripts\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.177103 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-config-data\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.179461 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.182122 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.191504 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4df4\" (UniqueName: \"kubernetes.io/projected/cacfcbd5-8c12-4fc5-88ce-516fda23464d-kube-api-access-q4df4\") pod \"ceilometer-0\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") " pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.217682 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:28:48 crc kubenswrapper[4931]: I0130 05:28:48.728625 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:28:49 crc kubenswrapper[4931]: I0130 05:28:49.444251 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc14ae7-f05f-4093-838b-bdd419f4302f" path="/var/lib/kubelet/pods/4dc14ae7-f05f-4093-838b-bdd419f4302f/volumes"
Jan 30 05:28:49 crc kubenswrapper[4931]: I0130 05:28:49.446457 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7" path="/var/lib/kubelet/pods/9bf8c845-e69c-41e6-9e59-4b9b9fceaaf7/volumes"
Jan 30 05:28:49 crc kubenswrapper[4931]: I0130 05:28:49.753784 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerStarted","Data":"9186f065f875b33ba8c3817977c815fff7b67dc7404ddb06ea6a85fb64800755"}
Jan 30 05:28:49 crc kubenswrapper[4931]: I0130 05:28:49.754108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerStarted","Data":"99a1153c1cd92ab2a34d0651a54dd16cc1116a03a4d5c96b1f4e7e5abbde1e2d"}
Jan 30 05:28:50 crc kubenswrapper[4931]: I0130 05:28:50.763208 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerStarted","Data":"0f6848e1ccd25c33da13cfce62f451555f794b35623ff124d320281a39cb9911"} Jan 30 05:28:50 crc kubenswrapper[4931]: I0130 05:28:50.763521 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerStarted","Data":"62da5f526098b3b9f5437a81119156f87878963a1c44c026236d9b63e20bbac5"} Jan 30 05:28:50 crc kubenswrapper[4931]: I0130 05:28:50.766137 4931 generic.go:334] "Generic (PLEG): container finished" podID="fac7a7da-7577-4269-8e37-fd964be6f75c" containerID="a32ca29963fb38a6014b8500b2c2495801c36c3f1563f62dcc7d71405aa5c328" exitCode=0 Jan 30 05:28:50 crc kubenswrapper[4931]: I0130 05:28:50.766180 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tjkcd" event={"ID":"fac7a7da-7577-4269-8e37-fd964be6f75c","Type":"ContainerDied","Data":"a32ca29963fb38a6014b8500b2c2495801c36c3f1563f62dcc7d71405aa5c328"} Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.188029 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tjkcd" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.350641 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-scripts\") pod \"fac7a7da-7577-4269-8e37-fd964be6f75c\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.350902 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-combined-ca-bundle\") pod \"fac7a7da-7577-4269-8e37-fd964be6f75c\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.350998 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqwjb\" (UniqueName: \"kubernetes.io/projected/fac7a7da-7577-4269-8e37-fd964be6f75c-kube-api-access-bqwjb\") pod \"fac7a7da-7577-4269-8e37-fd964be6f75c\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.351149 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-config-data\") pod \"fac7a7da-7577-4269-8e37-fd964be6f75c\" (UID: \"fac7a7da-7577-4269-8e37-fd964be6f75c\") " Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.357802 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-scripts" (OuterVolumeSpecName: "scripts") pod "fac7a7da-7577-4269-8e37-fd964be6f75c" (UID: "fac7a7da-7577-4269-8e37-fd964be6f75c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.363819 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac7a7da-7577-4269-8e37-fd964be6f75c-kube-api-access-bqwjb" (OuterVolumeSpecName: "kube-api-access-bqwjb") pod "fac7a7da-7577-4269-8e37-fd964be6f75c" (UID: "fac7a7da-7577-4269-8e37-fd964be6f75c"). InnerVolumeSpecName "kube-api-access-bqwjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.402480 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-config-data" (OuterVolumeSpecName: "config-data") pod "fac7a7da-7577-4269-8e37-fd964be6f75c" (UID: "fac7a7da-7577-4269-8e37-fd964be6f75c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.409238 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fac7a7da-7577-4269-8e37-fd964be6f75c" (UID: "fac7a7da-7577-4269-8e37-fd964be6f75c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.455179 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqwjb\" (UniqueName: \"kubernetes.io/projected/fac7a7da-7577-4269-8e37-fd964be6f75c-kube-api-access-bqwjb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.455798 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.455831 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.455861 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac7a7da-7577-4269-8e37-fd964be6f75c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.794608 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tjkcd" event={"ID":"fac7a7da-7577-4269-8e37-fd964be6f75c","Type":"ContainerDied","Data":"12bb0a37bfd07d7c0922ec928aebe0dc629e0ea396fd8488eb5f2642c0f2d238"} Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.794663 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12bb0a37bfd07d7c0922ec928aebe0dc629e0ea396fd8488eb5f2642c0f2d238" Jan 30 05:28:52 crc kubenswrapper[4931]: I0130 05:28:52.794672 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tjkcd" Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.040781 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.040823 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.065842 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.066104 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f4be9b51-9e05-4080-9aac-1e7a68785e90" containerName="nova-scheduler-scheduler" containerID="cri-o://fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" gracePeriod=30 Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.086194 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.094461 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.094726 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-log" containerID="cri-o://df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e" gracePeriod=30 Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.094871 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-metadata" containerID="cri-o://c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322" gracePeriod=30 Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.804890 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerStarted","Data":"25cc12087ab98d0fc79e679c4de5be61f557329f293a5a68393ba8b20a57c428"} Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.805330 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.808098 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerID="df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e" exitCode=143 Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.808199 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873","Type":"ContainerDied","Data":"df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e"} Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.808340 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-log" containerID="cri-o://d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8" gracePeriod=30 Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.808375 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-api" containerID="cri-o://32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61" gracePeriod=30 Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.815653 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": EOF" Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.815832 4931 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": EOF" Jan 30 05:28:53 crc kubenswrapper[4931]: I0130 05:28:53.842517 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.970462409 podStartE2EDuration="6.842494502s" podCreationTimestamp="2026-01-30 05:28:47 +0000 UTC" firstStartedPulling="2026-01-30 05:28:48.739383351 +0000 UTC m=+1264.109293618" lastFinishedPulling="2026-01-30 05:28:52.611415414 +0000 UTC m=+1267.981325711" observedRunningTime="2026-01-30 05:28:53.838170561 +0000 UTC m=+1269.208080828" watchObservedRunningTime="2026-01-30 05:28:53.842494502 +0000 UTC m=+1269.212404759" Jan 30 05:28:54 crc kubenswrapper[4931]: E0130 05:28:54.734486 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:28:54 crc kubenswrapper[4931]: E0130 05:28:54.736165 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:28:54 crc kubenswrapper[4931]: E0130 05:28:54.738075 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:28:54 crc kubenswrapper[4931]: E0130 05:28:54.738104 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f4be9b51-9e05-4080-9aac-1e7a68785e90" containerName="nova-scheduler-scheduler" Jan 30 05:28:54 crc kubenswrapper[4931]: I0130 05:28:54.834072 4931 generic.go:334] "Generic (PLEG): container finished" podID="84172ea2-ea94-454e-a247-3388dbd3f559" containerID="d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8" exitCode=143 Jan 30 05:28:54 crc kubenswrapper[4931]: I0130 05:28:54.834123 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84172ea2-ea94-454e-a247-3388dbd3f559","Type":"ContainerDied","Data":"d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8"} Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.806336 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.864843 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerID="c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322" exitCode=0 Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.864892 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873","Type":"ContainerDied","Data":"c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322"} Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.864921 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873","Type":"ContainerDied","Data":"ce7e3daf9b312cef06426fb53814cad3b92811315fdfd1796c0688baab6a72ed"} Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.864942 4931 scope.go:117] "RemoveContainer" containerID="c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.864998 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.904102 4931 scope.go:117] "RemoveContainer" containerID="df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.925607 4931 scope.go:117] "RemoveContainer" containerID="c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322" Jan 30 05:28:56 crc kubenswrapper[4931]: E0130 05:28:56.926307 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322\": container with ID starting with c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322 not found: ID does not exist" containerID="c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.926347 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322"} err="failed to get container status \"c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322\": rpc error: code = NotFound desc = could not find container \"c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322\": container with ID starting with c1b78b938372c88282024e91925a759a66838d2af03e52c363529c73f9f70322 not found: ID does not exist" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.926408 4931 scope.go:117] "RemoveContainer" containerID="df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e" Jan 30 05:28:56 crc kubenswrapper[4931]: E0130 05:28:56.926836 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e\": container with ID starting with 
df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e not found: ID does not exist" containerID="df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.926866 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e"} err="failed to get container status \"df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e\": rpc error: code = NotFound desc = could not find container \"df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e\": container with ID starting with df793b7adeadd15a2ed69b8d798d6152c6ea016bd80ec1e05b3e19a35a19ae6e not found: ID does not exist" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.960036 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-config-data\") pod \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.960121 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-nova-metadata-tls-certs\") pod \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.960173 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-logs\") pod \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.960342 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-combined-ca-bundle\") pod \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.960584 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhbcb\" (UniqueName: \"kubernetes.io/projected/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-kube-api-access-fhbcb\") pod \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\" (UID: \"0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873\") " Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.961147 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-logs" (OuterVolumeSpecName: "logs") pod "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" (UID: "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.962087 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:56 crc kubenswrapper[4931]: I0130 05:28:56.968739 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-kube-api-access-fhbcb" (OuterVolumeSpecName: "kube-api-access-fhbcb") pod "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" (UID: "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873"). InnerVolumeSpecName "kube-api-access-fhbcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.005966 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" (UID: "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.031209 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-config-data" (OuterVolumeSpecName: "config-data") pod "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" (UID: "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.042789 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" (UID: "0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.065291 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhbcb\" (UniqueName: \"kubernetes.io/projected/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-kube-api-access-fhbcb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.065330 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.065346 4931 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.065359 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.240558 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.263853 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.274947 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:57 crc kubenswrapper[4931]: E0130 05:28:57.275465 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac7a7da-7577-4269-8e37-fd964be6f75c" containerName="nova-manage" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.275490 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac7a7da-7577-4269-8e37-fd964be6f75c" containerName="nova-manage" Jan 30 
05:28:57 crc kubenswrapper[4931]: E0130 05:28:57.275521 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-metadata" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.275530 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-metadata" Jan 30 05:28:57 crc kubenswrapper[4931]: E0130 05:28:57.275549 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-log" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.275558 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-log" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.275798 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-metadata" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.275827 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac7a7da-7577-4269-8e37-fd964be6f75c" containerName="nova-manage" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.275844 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" containerName="nova-metadata-log" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.277406 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.280396 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.280708 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.283041 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.363345 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.363465 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.375584 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc8vz\" (UniqueName: \"kubernetes.io/projected/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-kube-api-access-pc8vz\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.375729 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-logs\") pod \"nova-metadata-0\" (UID: 
\"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.375771 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-config-data\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.375919 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.376061 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.440039 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873" path="/var/lib/kubelet/pods/0c0f5a0d-26b4-49ab-8bca-8a29ae2d5873/volumes" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.478386 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc8vz\" (UniqueName: \"kubernetes.io/projected/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-kube-api-access-pc8vz\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.478568 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-logs\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.478738 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-config-data\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.478835 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.478909 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.479859 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-logs\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.485626 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " 
pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.485752 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-config-data\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.489291 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.507882 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc8vz\" (UniqueName: \"kubernetes.io/projected/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-kube-api-access-pc8vz\") pod \"nova-metadata-0\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " pod="openstack/nova-metadata-0" Jan 30 05:28:57 crc kubenswrapper[4931]: I0130 05:28:57.594306 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:58 crc kubenswrapper[4931]: W0130 05:28:58.123607 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e6d6a8_599b_4ab9_b1f7_cf521e455d74.slice/crio-575ed258be47595d56c46486254ee62d83c72e80bf57019828419331f46802a7 WatchSource:0}: Error finding container 575ed258be47595d56c46486254ee62d83c72e80bf57019828419331f46802a7: Status 404 returned error can't find the container with id 575ed258be47595d56c46486254ee62d83c72e80bf57019828419331f46802a7 Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.133572 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.736992 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.807396 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-config-data\") pod \"f4be9b51-9e05-4080-9aac-1e7a68785e90\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.807887 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-combined-ca-bundle\") pod \"f4be9b51-9e05-4080-9aac-1e7a68785e90\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.807929 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr8p8\" (UniqueName: \"kubernetes.io/projected/f4be9b51-9e05-4080-9aac-1e7a68785e90-kube-api-access-dr8p8\") pod \"f4be9b51-9e05-4080-9aac-1e7a68785e90\" (UID: \"f4be9b51-9e05-4080-9aac-1e7a68785e90\") " 
Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.812226 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4be9b51-9e05-4080-9aac-1e7a68785e90-kube-api-access-dr8p8" (OuterVolumeSpecName: "kube-api-access-dr8p8") pod "f4be9b51-9e05-4080-9aac-1e7a68785e90" (UID: "f4be9b51-9e05-4080-9aac-1e7a68785e90"). InnerVolumeSpecName "kube-api-access-dr8p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.843447 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4be9b51-9e05-4080-9aac-1e7a68785e90" (UID: "f4be9b51-9e05-4080-9aac-1e7a68785e90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.846872 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-config-data" (OuterVolumeSpecName: "config-data") pod "f4be9b51-9e05-4080-9aac-1e7a68785e90" (UID: "f4be9b51-9e05-4080-9aac-1e7a68785e90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.884712 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4be9b51-9e05-4080-9aac-1e7a68785e90" containerID="fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" exitCode=0 Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.884775 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4be9b51-9e05-4080-9aac-1e7a68785e90","Type":"ContainerDied","Data":"fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd"} Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.884806 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4be9b51-9e05-4080-9aac-1e7a68785e90","Type":"ContainerDied","Data":"2dd5974fb15bfc06c0c4a9379a4055028a1a3284a814cf069766b15f03c9fc5e"} Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.884826 4931 scope.go:117] "RemoveContainer" containerID="fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.884948 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.888281 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74","Type":"ContainerStarted","Data":"6c90254ae67ae50ab19fa555ce55d1839d94322882a09bb91ad616b62efcfeeb"} Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.888317 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74","Type":"ContainerStarted","Data":"a268ff4ead170d4fc7c25a89e846ed2d0f10278b94da4082529cc4ebd9ab4f0e"} Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.888329 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74","Type":"ContainerStarted","Data":"575ed258be47595d56c46486254ee62d83c72e80bf57019828419331f46802a7"} Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.909842 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.909894 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4be9b51-9e05-4080-9aac-1e7a68785e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.909921 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr8p8\" (UniqueName: \"kubernetes.io/projected/f4be9b51-9e05-4080-9aac-1e7a68785e90-kube-api-access-dr8p8\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.912720 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.912669119 
podStartE2EDuration="1.912669119s" podCreationTimestamp="2026-01-30 05:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:58.90521395 +0000 UTC m=+1274.275124227" watchObservedRunningTime="2026-01-30 05:28:58.912669119 +0000 UTC m=+1274.282579406" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.930175 4931 scope.go:117] "RemoveContainer" containerID="fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" Jan 30 05:28:58 crc kubenswrapper[4931]: E0130 05:28:58.930676 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd\": container with ID starting with fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd not found: ID does not exist" containerID="fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.930701 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd"} err="failed to get container status \"fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd\": rpc error: code = NotFound desc = could not find container \"fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd\": container with ID starting with fb13c2d310bf911723630c03371a3530f6d596636164c4a527400ca34ebbd7fd not found: ID does not exist" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.936531 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.950002 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.957411 4931 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:58 crc kubenswrapper[4931]: E0130 05:28:58.957994 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4be9b51-9e05-4080-9aac-1e7a68785e90" containerName="nova-scheduler-scheduler" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.958018 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4be9b51-9e05-4080-9aac-1e7a68785e90" containerName="nova-scheduler-scheduler" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.958293 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4be9b51-9e05-4080-9aac-1e7a68785e90" containerName="nova-scheduler-scheduler" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.959097 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.961359 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 05:28:58 crc kubenswrapper[4931]: I0130 05:28:58.969575 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.113611 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5r9p\" (UniqueName: \"kubernetes.io/projected/1acfa9c2-a802-404e-976b-93d9f99e1fbb-kube-api-access-z5r9p\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.113677 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-config-data\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 
05:28:59.114057 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.216549 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5r9p\" (UniqueName: \"kubernetes.io/projected/1acfa9c2-a802-404e-976b-93d9f99e1fbb-kube-api-access-z5r9p\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.216617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-config-data\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.216717 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.222943 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.222991 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-config-data\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.240547 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5r9p\" (UniqueName: \"kubernetes.io/projected/1acfa9c2-a802-404e-976b-93d9f99e1fbb-kube-api-access-z5r9p\") pod \"nova-scheduler-0\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.274729 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.438119 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4be9b51-9e05-4080-9aac-1e7a68785e90" path="/var/lib/kubelet/pods/f4be9b51-9e05-4080-9aac-1e7a68785e90/volumes" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.758387 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.828211 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-public-tls-certs\") pod \"84172ea2-ea94-454e-a247-3388dbd3f559\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.828260 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vrpv\" (UniqueName: \"kubernetes.io/projected/84172ea2-ea94-454e-a247-3388dbd3f559-kube-api-access-2vrpv\") pod \"84172ea2-ea94-454e-a247-3388dbd3f559\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.828344 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84172ea2-ea94-454e-a247-3388dbd3f559-logs\") pod \"84172ea2-ea94-454e-a247-3388dbd3f559\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.828374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-internal-tls-certs\") pod \"84172ea2-ea94-454e-a247-3388dbd3f559\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.828434 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-combined-ca-bundle\") pod \"84172ea2-ea94-454e-a247-3388dbd3f559\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.828543 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-config-data\") pod \"84172ea2-ea94-454e-a247-3388dbd3f559\" (UID: \"84172ea2-ea94-454e-a247-3388dbd3f559\") " Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.829213 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84172ea2-ea94-454e-a247-3388dbd3f559-logs" (OuterVolumeSpecName: "logs") pod "84172ea2-ea94-454e-a247-3388dbd3f559" (UID: "84172ea2-ea94-454e-a247-3388dbd3f559"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.838508 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84172ea2-ea94-454e-a247-3388dbd3f559-kube-api-access-2vrpv" (OuterVolumeSpecName: "kube-api-access-2vrpv") pod "84172ea2-ea94-454e-a247-3388dbd3f559" (UID: "84172ea2-ea94-454e-a247-3388dbd3f559"). InnerVolumeSpecName "kube-api-access-2vrpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.878054 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-config-data" (OuterVolumeSpecName: "config-data") pod "84172ea2-ea94-454e-a247-3388dbd3f559" (UID: "84172ea2-ea94-454e-a247-3388dbd3f559"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.885010 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84172ea2-ea94-454e-a247-3388dbd3f559" (UID: "84172ea2-ea94-454e-a247-3388dbd3f559"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.909442 4931 generic.go:334] "Generic (PLEG): container finished" podID="84172ea2-ea94-454e-a247-3388dbd3f559" containerID="32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61" exitCode=0 Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.909509 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84172ea2-ea94-454e-a247-3388dbd3f559","Type":"ContainerDied","Data":"32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61"} Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.909703 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84172ea2-ea94-454e-a247-3388dbd3f559","Type":"ContainerDied","Data":"5def1c57849cb63c69d783f2aabd654a2c46a9078012cc3f016fb655404a7738"} Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.909731 4931 scope.go:117] "RemoveContainer" containerID="32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.909972 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.910745 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "84172ea2-ea94-454e-a247-3388dbd3f559" (UID: "84172ea2-ea94-454e-a247-3388dbd3f559"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.927198 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.936571 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.936635 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vrpv\" (UniqueName: \"kubernetes.io/projected/84172ea2-ea94-454e-a247-3388dbd3f559-kube-api-access-2vrpv\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.936647 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84172ea2-ea94-454e-a247-3388dbd3f559-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.936660 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.936669 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.951656 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "84172ea2-ea94-454e-a247-3388dbd3f559" (UID: "84172ea2-ea94-454e-a247-3388dbd3f559"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:59 crc kubenswrapper[4931]: I0130 05:28:59.996766 4931 scope.go:117] "RemoveContainer" containerID="d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.022613 4931 scope.go:117] "RemoveContainer" containerID="32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61" Jan 30 05:29:00 crc kubenswrapper[4931]: E0130 05:29:00.023077 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61\": container with ID starting with 32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61 not found: ID does not exist" containerID="32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.023126 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61"} err="failed to get container status \"32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61\": rpc error: code = NotFound desc = could not find container \"32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61\": container with ID starting with 32f9b0c81902795a5f01b845ba8964a6d9eb951a58b9171a271cd3f750b03f61 not found: ID does not exist" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.023145 4931 scope.go:117] "RemoveContainer" containerID="d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8" Jan 30 05:29:00 crc kubenswrapper[4931]: E0130 05:29:00.023511 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8\": container with ID starting with 
d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8 not found: ID does not exist" containerID="d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.023530 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8"} err="failed to get container status \"d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8\": rpc error: code = NotFound desc = could not find container \"d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8\": container with ID starting with d14ecfe1fd471fb982dc61b1f42b768a824415136baffeefdc2bb1ee2a9426e8 not found: ID does not exist" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.038143 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84172ea2-ea94-454e-a247-3388dbd3f559-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.263905 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.285313 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.305653 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:00 crc kubenswrapper[4931]: E0130 05:29:00.306319 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-api" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.306352 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-api" Jan 30 05:29:00 crc kubenswrapper[4931]: E0130 05:29:00.306401 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-log" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.306417 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-log" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.306777 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-log" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.306807 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" containerName="nova-api-api" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.308532 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.316561 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.316869 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.317071 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.329056 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.445514 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dkxm\" (UniqueName: \"kubernetes.io/projected/406c25f3-c398-4ace-ba4b-1d9b48b289a2-kube-api-access-8dkxm\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.445651 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.445704 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.445777 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/406c25f3-c398-4ace-ba4b-1d9b48b289a2-logs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.445835 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.445962 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-config-data\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.547117 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-config-data\") pod 
\"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.547264 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dkxm\" (UniqueName: \"kubernetes.io/projected/406c25f3-c398-4ace-ba4b-1d9b48b289a2-kube-api-access-8dkxm\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.547307 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.547323 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.547353 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/406c25f3-c398-4ace-ba4b-1d9b48b289a2-logs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.547379 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.547994 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/406c25f3-c398-4ace-ba4b-1d9b48b289a2-logs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.554493 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-config-data\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.554856 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.560476 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.564120 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.583305 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dkxm\" (UniqueName: \"kubernetes.io/projected/406c25f3-c398-4ace-ba4b-1d9b48b289a2-kube-api-access-8dkxm\") pod \"nova-api-0\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.681301 4931 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.927772 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1acfa9c2-a802-404e-976b-93d9f99e1fbb","Type":"ContainerStarted","Data":"cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52"} Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.927841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1acfa9c2-a802-404e-976b-93d9f99e1fbb","Type":"ContainerStarted","Data":"24154bd6bbe2da670ea864204ee97206379a1b7b92792be6f14d33757f908143"} Jan 30 05:29:00 crc kubenswrapper[4931]: I0130 05:29:00.972372 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.972342271 podStartE2EDuration="2.972342271s" podCreationTimestamp="2026-01-30 05:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:29:00.948647016 +0000 UTC m=+1276.318557323" watchObservedRunningTime="2026-01-30 05:29:00.972342271 +0000 UTC m=+1276.342252548" Jan 30 05:29:01 crc kubenswrapper[4931]: I0130 05:29:01.069825 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:01 crc kubenswrapper[4931]: W0130 05:29:01.073407 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod406c25f3_c398_4ace_ba4b_1d9b48b289a2.slice/crio-76659dd7c99e1a96db2d103669e2a9a9122278f8d46deab3719b00840b99f159 WatchSource:0}: Error finding container 76659dd7c99e1a96db2d103669e2a9a9122278f8d46deab3719b00840b99f159: Status 404 returned error can't find the container with id 76659dd7c99e1a96db2d103669e2a9a9122278f8d46deab3719b00840b99f159 Jan 30 05:29:01 crc kubenswrapper[4931]: I0130 
05:29:01.439574 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84172ea2-ea94-454e-a247-3388dbd3f559" path="/var/lib/kubelet/pods/84172ea2-ea94-454e-a247-3388dbd3f559/volumes" Jan 30 05:29:01 crc kubenswrapper[4931]: I0130 05:29:01.947816 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"406c25f3-c398-4ace-ba4b-1d9b48b289a2","Type":"ContainerStarted","Data":"d70e15b0f074e59f1a9f39048c2cf45a62e81400091cb70df139030d514fe003"} Jan 30 05:29:01 crc kubenswrapper[4931]: I0130 05:29:01.947866 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"406c25f3-c398-4ace-ba4b-1d9b48b289a2","Type":"ContainerStarted","Data":"e5dadd497214a5d2efc5b8027947f3661f7f73599b0778570358c42329955e8d"} Jan 30 05:29:01 crc kubenswrapper[4931]: I0130 05:29:01.947877 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"406c25f3-c398-4ace-ba4b-1d9b48b289a2","Type":"ContainerStarted","Data":"76659dd7c99e1a96db2d103669e2a9a9122278f8d46deab3719b00840b99f159"} Jan 30 05:29:01 crc kubenswrapper[4931]: I0130 05:29:01.974670 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.974650199 podStartE2EDuration="1.974650199s" podCreationTimestamp="2026-01-30 05:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:29:01.971558773 +0000 UTC m=+1277.341469050" watchObservedRunningTime="2026-01-30 05:29:01.974650199 +0000 UTC m=+1277.344560456" Jan 30 05:29:02 crc kubenswrapper[4931]: I0130 05:29:02.594804 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4931]: I0130 05:29:02.594862 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:29:04 crc 
kubenswrapper[4931]: I0130 05:29:04.275286 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 05:29:07 crc kubenswrapper[4931]: I0130 05:29:07.595284 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 05:29:07 crc kubenswrapper[4931]: I0130 05:29:07.595351 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 05:29:08 crc kubenswrapper[4931]: I0130 05:29:08.608672 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 05:29:08 crc kubenswrapper[4931]: I0130 05:29:08.608804 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 05:29:09 crc kubenswrapper[4931]: I0130 05:29:09.275205 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 05:29:09 crc kubenswrapper[4931]: I0130 05:29:09.318584 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 05:29:10 crc kubenswrapper[4931]: I0130 05:29:10.070095 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 05:29:10 crc kubenswrapper[4931]: I0130 05:29:10.683591 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:29:10 crc kubenswrapper[4931]: I0130 05:29:10.683644 4931 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:29:11 crc kubenswrapper[4931]: I0130 05:29:11.698611 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 05:29:11 crc kubenswrapper[4931]: I0130 05:29:11.698681 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 05:29:17 crc kubenswrapper[4931]: I0130 05:29:17.603718 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 05:29:17 crc kubenswrapper[4931]: I0130 05:29:17.606631 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 05:29:17 crc kubenswrapper[4931]: I0130 05:29:17.611780 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 05:29:18 crc kubenswrapper[4931]: I0130 05:29:18.133582 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 05:29:18 crc kubenswrapper[4931]: I0130 05:29:18.229525 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 05:29:20 crc kubenswrapper[4931]: I0130 05:29:20.694316 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 05:29:20 crc kubenswrapper[4931]: I0130 05:29:20.695082 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 05:29:20 crc 
kubenswrapper[4931]: I0130 05:29:20.699626 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 05:29:20 crc kubenswrapper[4931]: I0130 05:29:20.703885 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 05:29:21 crc kubenswrapper[4931]: I0130 05:29:21.165302 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 05:29:21 crc kubenswrapper[4931]: I0130 05:29:21.177922 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 05:29:27 crc kubenswrapper[4931]: I0130 05:29:27.363554 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:29:27 crc kubenswrapper[4931]: I0130 05:29:27.364384 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.181112 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7789bbd757-45b5w"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.183115 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.203510 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.217187 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-867d8cd54-77bnr"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.220710 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.245895 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7789bbd757-45b5w"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.277352 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.277552 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="cinder-scheduler" containerID="cri-o://9398f7e713fb447b3a151d286be2d2910e4d8535fd421e906c46b8cc2c9a4728" gracePeriod=30 Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.277627 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="probe" containerID="cri-o://571155fa2c4a4cc11bc78f96b7b5b636cdda183726d54338d2bf0cc02d77f003" gracePeriod=30 Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.320490 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.320884 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="6b263e8e-7618-4044-bed1-b35174d6a8f4" 
containerName="openstackclient" containerID="cri-o://998f26954c016e9a4be4fed72f68f879a1f7793c171311545d8f4958871325fd" gracePeriod=2 Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325451 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-combined-ca-bundle\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325534 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325580 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpwf5\" (UniqueName: \"kubernetes.io/projected/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-kube-api-access-jpwf5\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325609 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac55021-a07e-443f-9ee9-e7516556b975-logs\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325625 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-logs\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325671 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mc9v\" (UniqueName: \"kubernetes.io/projected/9ac55021-a07e-443f-9ee9-e7516556b975-kube-api-access-7mc9v\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325696 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data-custom\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325736 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.325788 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data-custom\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc 
kubenswrapper[4931]: I0130 05:29:41.325803 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-combined-ca-bundle\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.330247 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.330335 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data podName:fc3f4796-66b1-452b-afca-5e62cbf2a53b nodeName:}" failed. No retries permitted until 2026-01-30 05:29:41.830320777 +0000 UTC m=+1317.200231034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data") pod "rabbitmq-server-0" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b") : configmap "rabbitmq-config-data" not found Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.359512 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-867d8cd54-77bnr"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.382479 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.400410 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6xxt5"] Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.400824 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b263e8e-7618-4044-bed1-b35174d6a8f4" containerName="openstackclient" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.400840 4931 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6b263e8e-7618-4044-bed1-b35174d6a8f4" containerName="openstackclient" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.401036 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b263e8e-7618-4044-bed1-b35174d6a8f4" containerName="openstackclient" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.401611 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.411613 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.427667 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6xxt5"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432713 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data-custom\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432755 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-combined-ca-bundle\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-combined-ca-bundle\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" 
(UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432896 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpwf5\" (UniqueName: \"kubernetes.io/projected/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-kube-api-access-jpwf5\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432922 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac55021-a07e-443f-9ee9-e7516556b975-logs\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432939 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-logs\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432973 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mc9v\" (UniqueName: \"kubernetes.io/projected/9ac55021-a07e-443f-9ee9-e7516556b975-kube-api-access-7mc9v\") pod 
\"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.432994 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data-custom\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.433018 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.463579 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-logs\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.465289 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-combined-ca-bundle\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.472853 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac55021-a07e-443f-9ee9-e7516556b975-logs\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: 
\"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.473182 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data-custom\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.479948 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-combined-ca-bundle\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.489876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.490353 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data-custom\") pod \"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.531071 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mc9v\" (UniqueName: \"kubernetes.io/projected/9ac55021-a07e-443f-9ee9-e7516556b975-kube-api-access-7mc9v\") pod 
\"barbican-keystone-listener-867d8cd54-77bnr\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.538744 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts\") pod \"root-account-create-update-6xxt5\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.538854 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7wh\" (UniqueName: \"kubernetes.io/projected/623f3c8f-d741-4ba4-baca-905a13102f38-kube-api-access-pq7wh\") pod \"root-account-create-update-6xxt5\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.542070 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.544529 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpwf5\" (UniqueName: \"kubernetes.io/projected/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-kube-api-access-jpwf5\") pod \"barbican-worker-7789bbd757-45b5w\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.548093 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.636601 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cb5c-account-create-update-n52qj"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.647943 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts\") pod \"root-account-create-update-6xxt5\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.648032 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7wh\" (UniqueName: \"kubernetes.io/projected/623f3c8f-d741-4ba4-baca-905a13102f38-kube-api-access-pq7wh\") pod \"root-account-create-update-6xxt5\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.649289 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts\") pod \"root-account-create-update-6xxt5\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.650591 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.663566 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cb5c-account-create-update-n52qj"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.673123 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.687970 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.688228 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api-log" containerID="cri-o://c948d726013eb4e8273ef998118172023cae6536ca99db77a7f1ebd4884def12" gracePeriod=30 Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.688342 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api" containerID="cri-o://2823dcc09d156bc746ffbc3ab196c3d6e136f453bf377837d7cce10861a168f4" gracePeriod=30 Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.712692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7wh\" (UniqueName: \"kubernetes.io/projected/623f3c8f-d741-4ba4-baca-905a13102f38-kube-api-access-pq7wh\") pod \"root-account-create-update-6xxt5\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.713663 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-cbdc6b6c8-m9v7c"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.715128 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.728102 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.739977 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-p975f"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.751411 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ad7de9-e01d-414c-8a4d-9073ad986186-operator-scripts\") pod \"barbican-cb5c-account-create-update-n52qj\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") " pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.751700 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9gv\" (UniqueName: \"kubernetes.io/projected/46ad7de9-e01d-414c-8a4d-9073ad986186-kube-api-access-5c9gv\") pod \"barbican-cb5c-account-create-update-n52qj\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") " pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.781821 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-p975f"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.805733 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cbdc6b6c8-m9v7c"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.806174 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.829529 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cb5c-account-create-update-7n4vq"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.842924 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cb5c-account-create-update-7n4vq"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855566 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9tkc\" (UniqueName: \"kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855654 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-public-tls-certs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855701 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-combined-ca-bundle\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855724 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-internal-tls-certs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: 
\"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855750 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c9gv\" (UniqueName: \"kubernetes.io/projected/46ad7de9-e01d-414c-8a4d-9073ad986186-kube-api-access-5c9gv\") pod \"barbican-cb5c-account-create-update-n52qj\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") " pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855786 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855844 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98fff7bd-db4c-462f-8f2c-34733f4e81ad-logs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855882 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data-custom\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.855908 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ad7de9-e01d-414c-8a4d-9073ad986186-operator-scripts\") pod 
\"barbican-cb5c-account-create-update-n52qj\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") " pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.856542 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ad7de9-e01d-414c-8a4d-9073ad986186-operator-scripts\") pod \"barbican-cb5c-account-create-update-n52qj\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") " pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.856630 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.856669 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data podName:fc3f4796-66b1-452b-afca-5e62cbf2a53b nodeName:}" failed. No retries permitted until 2026-01-30 05:29:42.856656704 +0000 UTC m=+1318.226566961 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data") pod "rabbitmq-server-0" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b") : configmap "rabbitmq-config-data" not found Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.877512 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.894077 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9gv\" (UniqueName: \"kubernetes.io/projected/46ad7de9-e01d-414c-8a4d-9073ad986186-kube-api-access-5c9gv\") pod \"barbican-cb5c-account-create-update-n52qj\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") " pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.951907 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8ee9-account-create-update-c7rsn"] Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.954279 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.969053 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-public-tls-certs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.969111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-combined-ca-bundle\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.969131 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-internal-tls-certs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.969192 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.969281 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98fff7bd-db4c-462f-8f2c-34733f4e81ad-logs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " 
pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.969330 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data-custom\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.969368 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9tkc\" (UniqueName: \"kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.979580 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.982718 4931 projected.go:194] Error preparing data for projected volume kube-api-access-t9tkc for pod openstack/barbican-api-cbdc6b6c8-m9v7c: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.982797 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. No retries permitted until 2026-01-30 05:29:42.482778751 +0000 UTC m=+1317.852688998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t9tkc" (UniqueName: "kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 30 05:29:41 crc kubenswrapper[4931]: I0130 05:29:41.983303 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8ee9-account-create-update-c7rsn"] Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.984804 4931 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.984845 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. No retries permitted until 2026-01-30 05:29:42.484832129 +0000 UTC m=+1317.854742386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : secret "barbican-config-data" not found Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.984878 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:41 crc kubenswrapper[4931]: E0130 05:29:41.985035 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data podName:081e3873-ea99-4486-925f-784a98e49405 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:42.484890541 +0000 UTC m=+1317.854935392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data") pod "rabbitmq-cell1-server-0" (UID: "081e3873-ea99-4486-925f-784a98e49405") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:41.992665 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-internal-tls-certs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.014902 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98fff7bd-db4c-462f-8f2c-34733f4e81ad-logs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.015601 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.020496 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8ee9-account-create-update-sdn4j"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.026387 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-combined-ca-bundle\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.028390 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-public-tls-certs\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.035476 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kbkmb"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.037303 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data-custom\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.071552 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4flg\" (UniqueName: \"kubernetes.io/projected/5d4d7097-4e75-41cb-b451-6feb8e2184b9-kube-api-access-q4flg\") pod \"cinder-8ee9-account-create-update-c7rsn\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " 
pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.071627 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d7097-4e75-41cb-b451-6feb8e2184b9-operator-scripts\") pod \"cinder-8ee9-account-create-update-c7rsn\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.072521 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8ee9-account-create-update-sdn4j"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.096089 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kbkmb"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.141084 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.141331 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="ovn-northd" containerID="cri-o://cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949" gracePeriod=30 Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.141789 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="openstack-network-exporter" containerID="cri-o://dea51d6ee685a2470eaa0864347990ea744cedc85d00846568c67d56ba221ee1" gracePeriod=30 Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.156096 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-df05-account-create-update-xmzpk"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.173085 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q4flg\" (UniqueName: \"kubernetes.io/projected/5d4d7097-4e75-41cb-b451-6feb8e2184b9-kube-api-access-q4flg\") pod \"cinder-8ee9-account-create-update-c7rsn\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.173411 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d7097-4e75-41cb-b451-6feb8e2184b9-operator-scripts\") pod \"cinder-8ee9-account-create-update-c7rsn\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.174204 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d7097-4e75-41cb-b451-6feb8e2184b9-operator-scripts\") pod \"cinder-8ee9-account-create-update-c7rsn\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.174481 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-df05-account-create-update-nrbm4"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.175535 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.189929 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.190304 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-df05-account-create-update-xmzpk"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.210567 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-df05-account-create-update-nrbm4"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.234290 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4flg\" (UniqueName: \"kubernetes.io/projected/5d4d7097-4e75-41cb-b451-6feb8e2184b9-kube-api-access-q4flg\") pod \"cinder-8ee9-account-create-update-c7rsn\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.258026 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0120-account-create-update-cj262"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.259141 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.275053 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-ntbbc"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.276449 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.282654 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ef60747-e73b-451c-b8e1-6abd596d31bb-operator-scripts\") pod \"glance-df05-account-create-update-nrbm4\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") " pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.282901 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjv6q\" (UniqueName: \"kubernetes.io/projected/7ef60747-e73b-451c-b8e1-6abd596d31bb-kube-api-access-xjv6q\") pod \"glance-df05-account-create-update-nrbm4\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") " pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.285155 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.290221 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.308471 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0120-account-create-update-cj262"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.334663 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-ntbbc"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.337869 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.384338 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjv6q\" (UniqueName: \"kubernetes.io/projected/7ef60747-e73b-451c-b8e1-6abd596d31bb-kube-api-access-xjv6q\") pod \"glance-df05-account-create-update-nrbm4\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") " pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.384402 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc6vk\" (UniqueName: \"kubernetes.io/projected/f493e630-c604-4fd1-9fa6-f26d6d1a179a-kube-api-access-zc6vk\") pod \"nova-cell0-10f6-account-create-update-ntbbc\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.398497 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f493e630-c604-4fd1-9fa6-f26d6d1a179a-operator-scripts\") pod \"nova-cell0-10f6-account-create-update-ntbbc\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.398607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6j5x\" (UniqueName: \"kubernetes.io/projected/d13136a7-4633-4386-822d-ceb2cb3320b8-kube-api-access-l6j5x\") pod \"nova-api-0120-account-create-update-cj262\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") " pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.398738 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ef60747-e73b-451c-b8e1-6abd596d31bb-operator-scripts\") pod \"glance-df05-account-create-update-nrbm4\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") " pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.398826 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13136a7-4633-4386-822d-ceb2cb3320b8-operator-scripts\") pod \"nova-api-0120-account-create-update-cj262\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") " pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.399765 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ef60747-e73b-451c-b8e1-6abd596d31bb-operator-scripts\") pod \"glance-df05-account-create-update-nrbm4\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") " pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.436074 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjv6q\" (UniqueName: \"kubernetes.io/projected/7ef60747-e73b-451c-b8e1-6abd596d31bb-kube-api-access-xjv6q\") pod \"glance-df05-account-create-update-nrbm4\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") " pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.443493 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fkqxj"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.460922 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fkqxj"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.523543 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t9tkc\" (UniqueName: \"kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.523655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13136a7-4633-4386-822d-ceb2cb3320b8-operator-scripts\") pod \"nova-api-0120-account-create-update-cj262\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") " pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.523706 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.523843 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc6vk\" (UniqueName: \"kubernetes.io/projected/f493e630-c604-4fd1-9fa6-f26d6d1a179a-kube-api-access-zc6vk\") pod \"nova-cell0-10f6-account-create-update-ntbbc\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.523863 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f493e630-c604-4fd1-9fa6-f26d6d1a179a-operator-scripts\") pod \"nova-cell0-10f6-account-create-update-ntbbc\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.523898 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6j5x\" (UniqueName: \"kubernetes.io/projected/d13136a7-4633-4386-822d-ceb2cb3320b8-kube-api-access-l6j5x\") pod \"nova-api-0120-account-create-update-cj262\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") " pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.525206 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.525302 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data podName:081e3873-ea99-4486-925f-784a98e49405 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:43.525270955 +0000 UTC m=+1318.895181212 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data") pod "rabbitmq-cell1-server-0" (UID: "081e3873-ea99-4486-925f-784a98e49405") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.526466 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13136a7-4633-4386-822d-ceb2cb3320b8-operator-scripts\") pod \"nova-api-0120-account-create-update-cj262\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") " pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.526546 4931 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.526580 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data 
podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. No retries permitted until 2026-01-30 05:29:43.526569392 +0000 UTC m=+1318.896479649 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : secret "barbican-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.527685 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f493e630-c604-4fd1-9fa6-f26d6d1a179a-operator-scripts\") pod \"nova-cell0-10f6-account-create-update-ntbbc\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.536464 4931 generic.go:334] "Generic (PLEG): container finished" podID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerID="dea51d6ee685a2470eaa0864347990ea744cedc85d00846568c67d56ba221ee1" exitCode=2 Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.536566 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91","Type":"ContainerDied","Data":"dea51d6ee685a2470eaa0864347990ea744cedc85d00846568c67d56ba221ee1"} Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.547321 4931 projected.go:194] Error preparing data for projected volume kube-api-access-t9tkc for pod openstack/barbican-api-cbdc6b6c8-m9v7c: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.549913 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-df05-account-create-update-nrbm4" Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.557829 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. No retries permitted until 2026-01-30 05:29:43.557764694 +0000 UTC m=+1318.927674951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9tkc" (UniqueName: "kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.563055 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-326d-account-create-update-b25zb"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.565978 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.574212 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.623589 4931 generic.go:334] "Generic (PLEG): container finished" podID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerID="c948d726013eb4e8273ef998118172023cae6536ca99db77a7f1ebd4884def12" exitCode=143 Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.623975 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e","Type":"ContainerDied","Data":"c948d726013eb4e8273ef998118172023cae6536ca99db77a7f1ebd4884def12"} Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.689323 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc6vk\" (UniqueName: \"kubernetes.io/projected/f493e630-c604-4fd1-9fa6-f26d6d1a179a-kube-api-access-zc6vk\") pod \"nova-cell0-10f6-account-create-update-ntbbc\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.695394 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6j5x\" (UniqueName: \"kubernetes.io/projected/d13136a7-4633-4386-822d-ceb2cb3320b8-kube-api-access-l6j5x\") pod \"nova-api-0120-account-create-update-cj262\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") " pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.699105 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.757945 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-b25zb"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.758615 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-operator-scripts\") pod \"nova-cell1-326d-account-create-update-b25zb\" (UID: \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.758739 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swhwl\" (UniqueName: \"kubernetes.io/projected/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-kube-api-access-swhwl\") pod \"nova-cell1-326d-account-create-update-b25zb\" (UID: \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.845404 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.846057 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="openstack-network-exporter" containerID="cri-o://4583c4efc80289000d7023b793d84dc55442d51907c690cc558e927738cb2e88" gracePeriod=300 Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.861703 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-operator-scripts\") pod \"nova-cell1-326d-account-create-update-b25zb\" (UID: 
\"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.861779 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swhwl\" (UniqueName: \"kubernetes.io/projected/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-kube-api-access-swhwl\") pod \"nova-cell1-326d-account-create-update-b25zb\" (UID: \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.861939 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4931]: E0130 05:29:42.861985 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data podName:fc3f4796-66b1-452b-afca-5e62cbf2a53b nodeName:}" failed. No retries permitted until 2026-01-30 05:29:44.861970479 +0000 UTC m=+1320.231880736 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data") pod "rabbitmq-server-0" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b") : configmap "rabbitmq-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.874902 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-operator-scripts\") pod \"nova-cell1-326d-account-create-update-b25zb\" (UID: \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.902156 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swhwl\" (UniqueName: \"kubernetes.io/projected/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-kube-api-access-swhwl\") pod \"nova-cell1-326d-account-create-update-b25zb\" (UID: \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.929767 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rpr97"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.940085 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0120-account-create-update-dptmf"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.940751 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.956508 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75d9f6f6ff-kmswn"] Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.956843 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75d9f6f6ff-kmswn" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-api" containerID="cri-o://59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22" gracePeriod=30 Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.957248 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75d9f6f6ff-kmswn" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-httpd" containerID="cri-o://e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e" gracePeriod=30 Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.975865 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0120-account-create-update-cj262" Jan 30 05:29:42 crc kubenswrapper[4931]: I0130 05:29:42.986312 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0120-account-create-update-dptmf"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.003677 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="ovsdbserver-sb" containerID="cri-o://8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c" gracePeriod=300 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.030340 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rpr97"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.047311 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.047689 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="openstack-network-exporter" containerID="cri-o://36f1f59d90f7e1367de837bd2375e2c11d0df21e5687e4d77b474faff3e8df0b" gracePeriod=300 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.073784 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-vfdzl"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.089467 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-vfdzl"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.124205 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-dvktv"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.124545 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-dvktv" 
podUID="4ba289fc-17e9-45e9-ac24-434d69045d97" containerName="openstack-network-exporter" containerID="cri-o://82c70d68aab65fc3db72ee184a048732b17b72a09f49232810d0c430a261f1e7" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.140706 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ggjtl"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.151410 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-thxc2"] Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.158980 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c is running failed: container process not found" containerID="8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.161013 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c is running failed: container process not found" containerID="8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.166889 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c is running failed: container process not found" containerID="8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.166962 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of 8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="ovsdbserver-sb" Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.199275 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf28f211b_be26_4f15_92a1_36b91cb53bbb.slice/crio-4583c4efc80289000d7023b793d84dc55442d51907c690cc558e927738cb2e88.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf28f211b_be26_4f15_92a1_36b91cb53bbb.slice/crio-conmon-8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.219600 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wxb94"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.256813 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wxb94"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.281575 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-cwv94"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.295980 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-rvcsw"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.317516 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-cwv94"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.327235 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ldr24"] Jan 30 05:29:43 crc kubenswrapper[4931]: W0130 05:29:43.339129 4931 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod623f3c8f_d741_4ba4_baca_905a13102f38.slice/crio-b0a95f43d76f1ec6c04259c654f8d4d57485ed8bd7f8f8efedcad5a660e7a5b0 WatchSource:0}: Error finding container b0a95f43d76f1ec6c04259c654f8d4d57485ed8bd7f8f8efedcad5a660e7a5b0: Status 404 returned error can't find the container with id b0a95f43d76f1ec6c04259c654f8d4d57485ed8bd7f8f8efedcad5a660e7a5b0 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.390080 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-rvcsw"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.418365 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ldr24"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.451518 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="ovsdbserver-nb" containerID="cri-o://ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21" gracePeriod=300 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.453844 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="053ccacf-d473-49f5-89e5-545a753e5e03" path="/var/lib/kubelet/pods/053ccacf-d473-49f5-89e5-545a753e5e03/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.454799 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c65b18-0526-4eec-a608-20478c5eb008" path="/var/lib/kubelet/pods/08c65b18-0526-4eec-a608-20478c5eb008/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.455481 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff3c7ac-e403-4826-bf45-a6bed05570b7" path="/var/lib/kubelet/pods/1ff3c7ac-e403-4826-bf45-a6bed05570b7/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.456004 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719" path="/var/lib/kubelet/pods/2b999b60-d1ed-4c1a-8ca2-da7aa9dbb719/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.457024 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c29ace9-3be7-44a1-b8eb-d356a4721152" path="/var/lib/kubelet/pods/2c29ace9-3be7-44a1-b8eb-d356a4721152/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.457558 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b78d3f2-c575-4b24-bbb8-c956f61a575d" path="/var/lib/kubelet/pods/3b78d3f2-c575-4b24-bbb8-c956f61a575d/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.458110 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438fbbb5-a318-4714-9dac-e3f0fc3f63d3" path="/var/lib/kubelet/pods/438fbbb5-a318-4714-9dac-e3f0fc3f63d3/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.476560 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd6723b-baf8-47eb-a774-68a5dfbcc4a6" path="/var/lib/kubelet/pods/6dd6723b-baf8-47eb-a774-68a5dfbcc4a6/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.483217 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1" path="/var/lib/kubelet/pods/7d8ef5ef-69a5-48f4-87d7-dbce2a99f8a1/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.487746 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1b1f6c-2147-48f7-87ea-e64672036831" path="/var/lib/kubelet/pods/bf1b1f6c-2147-48f7-87ea-e64672036831/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.490040 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98e6af1-4571-4da7-a6e8-0b54505af47c" path="/var/lib/kubelet/pods/d98e6af1-4571-4da7-a6e8-0b54505af47c/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.493264 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd612f9b-4de8-48e4-a945-c97e5c495292" path="/var/lib/kubelet/pods/dd612f9b-4de8-48e4-a945-c97e5c495292/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.499910 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ddcee7-a757-43b5-bf76-552cbd8d9078" path="/var/lib/kubelet/pods/f3ddcee7-a757-43b5-bf76-552cbd8d9078/volumes" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.502094 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-ctzjd"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.502948 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.523306 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" podUID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerName="dnsmasq-dns" containerID="cri-o://e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e" gracePeriod=10 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.606073 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9tkc\" (UniqueName: \"kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.611621 4931 projected.go:194] Error preparing data for projected volume kube-api-access-t9tkc for pod openstack/barbican-api-cbdc6b6c8-m9v7c: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.632409 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.634950 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.635021 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data podName:081e3873-ea99-4486-925f-784a98e49405 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:45.635002744 +0000 UTC m=+1321.004913001 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data") pod "rabbitmq-cell1-server-0" (UID: "081e3873-ea99-4486-925f-784a98e49405") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.636554 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637175 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-server" containerID="cri-o://e78b14aff0684ab7de691a18e86ce169b9e67b8f02342d8eba7927de4cb39ec6" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637579 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="swift-recon-cron" containerID="cri-o://7960131bc61ab6450751b905e24e2ccae8d9fe2d400984f5011874b3859c6719" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637625 4931 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="rsync" containerID="cri-o://fa1a93081b269f4ada317cd8046ebb7f1a7c1edf1f6e97c13ed393eacd7e1973" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637668 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-expirer" containerID="cri-o://cf71a5f4e5a8611b3edb8a350ba0e2eedbd78c0fb76770c94841152df4a3ab69" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637706 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-updater" containerID="cri-o://577bb47efe5f44d38e3c888fdf879028a229599eb11554a344d6a077afa58802" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637749 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-auditor" containerID="cri-o://2aa8176e0269c78ed82e92b582f8a0a44311ad87daa1079e948a15315c72207f" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637793 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-replicator" containerID="cri-o://cc9cceab2cf461936102038fbf1707f4be2e195decb5808193c7f51c4adb08d3" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637832 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-server" 
containerID="cri-o://b4e57fcd32132c1d41dc41783803f5dfdbd53a0317437ca189732c5c62a33471" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637872 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-updater" containerID="cri-o://6088dfc85b1e09a936dd16faccae994e80e2ccb29840c2c09302842b83328fc2" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637915 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-auditor" containerID="cri-o://072ac216076cb0fec2ec21789975f4b6fdf297b846d9774f980cb280a52a2718" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637954 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-replicator" containerID="cri-o://840bb675eae49d372214aa49017516eb7cc03feb5f0cebb6fb56a2dd4d0837b9" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.637994 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-server" containerID="cri-o://01987d0b4f025a347544f55c09ea6cf4f3249f746d37222f35ee196eb4525b63" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.638052 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-reaper" containerID="cri-o://9eac75907fb7af02eb9159bedaf64c4ca7dee04ca441b549c3a48132b186515f" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.638092 4931 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-auditor" containerID="cri-o://64945c3ef451f83b413801249e5cc8cedcb622a5a129ee80defec98e393eed29" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.638168 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-replicator" containerID="cri-o://de7939eb8c76f478b34a03e7dd08f4a308dc8bbb63a287db0a5b3eec8794cc7c" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.638286 4931 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.638328 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. No retries permitted until 2026-01-30 05:29:45.638317297 +0000 UTC m=+1321.008227554 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : secret "barbican-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4931]: E0130 05:29:43.642654 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. No retries permitted until 2026-01-30 05:29:45.642620279 +0000 UTC m=+1321.012530536 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t9tkc" (UniqueName: "kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.684406 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-798b7dc5fb-xl2zq"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.684730 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-798b7dc5fb-xl2zq" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-log" containerID="cri-o://1d733edb3ceaca43f34355e23bbaaced9e55a731057ead7b89c96398337d6e11" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.684917 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-798b7dc5fb-xl2zq" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-api" containerID="cri-o://e8519c60ec437acc9c9b5934ab3951ad5ad349186eda26fc85c2bae9b3010203" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.720844 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bcdcb"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.769594 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerName="rabbitmq" containerID="cri-o://1c82fc5914a60be58942659c6c59b346ba961ba1c401d7f0c82d22447fc0b135" gracePeriod=604800 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.770513 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-bcdcb"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.779657 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-6xxt5" event={"ID":"623f3c8f-d741-4ba4-baca-905a13102f38","Type":"ContainerStarted","Data":"b0a95f43d76f1ec6c04259c654f8d4d57485ed8bd7f8f8efedcad5a660e7a5b0"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.791007 4931 generic.go:334] "Generic (PLEG): container finished" podID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerID="571155fa2c4a4cc11bc78f96b7b5b636cdda183726d54338d2bf0cc02d77f003" exitCode=0 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.791157 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd","Type":"ContainerDied","Data":"571155fa2c4a4cc11bc78f96b7b5b636cdda183726d54338d2bf0cc02d77f003"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.794448 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-557f-account-create-update-6vjq5"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.817884 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-557f-account-create-update-6vjq5"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.825139 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_49a63fb4-24bc-4834-b6e7-937688c5de09/ovsdbserver-nb/0.log" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.825199 4931 generic.go:334] "Generic (PLEG): container finished" podID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerID="36f1f59d90f7e1367de837bd2375e2c11d0df21e5687e4d77b474faff3e8df0b" exitCode=2 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.825221 4931 generic.go:334] "Generic (PLEG): container finished" podID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerID="ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21" exitCode=143 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.825289 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"49a63fb4-24bc-4834-b6e7-937688c5de09","Type":"ContainerDied","Data":"36f1f59d90f7e1367de837bd2375e2c11d0df21e5687e4d77b474faff3e8df0b"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.825315 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"49a63fb4-24bc-4834-b6e7-937688c5de09","Type":"ContainerDied","Data":"ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.850793 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" event={"ID":"9ac55021-a07e-443f-9ee9-e7516556b975","Type":"ContainerStarted","Data":"ddc1c9e389f315057ec0a85201907373bcd7582adeb9a6f356d1b36e03264dc9"} Jan 30 05:29:43 crc kubenswrapper[4931]: W0130 05:29:43.858080 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46ad7de9_e01d_414c_8a4d_9073ad986186.slice/crio-d1886781d10ecb06348f9abb9caccf69a1bc5841a942465cb5e9e6aa23d451dd WatchSource:0}: Error finding container d1886781d10ecb06348f9abb9caccf69a1bc5841a942465cb5e9e6aa23d451dd: Status 404 returned error can't find the container with id d1886781d10ecb06348f9abb9caccf69a1bc5841a942465cb5e9e6aa23d451dd Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.895771 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dvktv_4ba289fc-17e9-45e9-ac24-434d69045d97/openstack-network-exporter/0.log" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.895823 4931 generic.go:334] "Generic (PLEG): container finished" podID="4ba289fc-17e9-45e9-ac24-434d69045d97" containerID="82c70d68aab65fc3db72ee184a048732b17b72a09f49232810d0c430a261f1e7" exitCode=2 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.895916 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dvktv" 
event={"ID":"4ba289fc-17e9-45e9-ac24-434d69045d97","Type":"ContainerDied","Data":"82c70d68aab65fc3db72ee184a048732b17b72a09f49232810d0c430a261f1e7"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.913052 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9b8l8"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.943109 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f28f211b-be26-4f15-92a1-36b91cb53bbb/ovsdbserver-sb/0.log" Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.943515 4931 generic.go:334] "Generic (PLEG): container finished" podID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerID="4583c4efc80289000d7023b793d84dc55442d51907c690cc558e927738cb2e88" exitCode=2 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.943629 4931 generic.go:334] "Generic (PLEG): container finished" podID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerID="8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c" exitCode=143 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.943851 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f28f211b-be26-4f15-92a1-36b91cb53bbb","Type":"ContainerDied","Data":"4583c4efc80289000d7023b793d84dc55442d51907c690cc558e927738cb2e88"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.944038 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f28f211b-be26-4f15-92a1-36b91cb53bbb","Type":"ContainerDied","Data":"8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.951558 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9b8l8"] Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.970892 4931 generic.go:334] "Generic (PLEG): container finished" podID="e1f9790c-c395-4c72-b569-3140f703b56f" 
containerID="e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e" exitCode=0 Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.971055 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75d9f6f6ff-kmswn" event={"ID":"e1f9790c-c395-4c72-b569-3140f703b56f","Type":"ContainerDied","Data":"e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e"} Jan 30 05:29:43 crc kubenswrapper[4931]: I0130 05:29:43.987959 4931 generic.go:334] "Generic (PLEG): container finished" podID="6b263e8e-7618-4044-bed1-b35174d6a8f4" containerID="998f26954c016e9a4be4fed72f68f879a1f7793c171311545d8f4958871325fd" exitCode=137 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.046650 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tjkcd"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.051678 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" containerID="cri-o://52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.053922 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4931]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_CMD="mysql -h -u root -P 
3306" Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: if [ -n "barbican" ]; then Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="barbican" Jan 30 05:29:44 crc kubenswrapper[4931]: else Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4931]: fi Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4931]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4931]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4931]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4931]: # support updates Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.062688 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tjkcd"] Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.066079 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-cb5c-account-create-update-n52qj" podUID="46ad7de9-e01d-414c-8a4d-9073ad986186" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.070466 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-867d8cd54-77bnr"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.101696 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6xxt5"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.109501 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-create-hqm5b"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.117390 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hqm5b"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.125474 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.125755 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-log" containerID="cri-o://754804bc268dc311547eddc996a035b132392b26798898d2ba034bc32dc1ee16" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.125962 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-httpd" containerID="cri-o://3509d69982e816f9732671b256bbf363b32c9a199362011499a8607bf3a6e808" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.131526 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.131733 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-log" containerID="cri-o://3795752efe01b170e153d47107ec186f09220104cdec5c71d0a210a63580f858" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.132106 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-httpd" containerID="cri-o://cc448c5e4a9d4def969b75156b3cc39bbccbe47f49a05ef9d15592b4643a809f" gracePeriod=30 Jan 30 05:29:44 crc 
kubenswrapper[4931]: I0130 05:29:44.141231 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a921-account-create-update-mqpxv"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.169357 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a921-account-create-update-mqpxv"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.179016 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dvktv_4ba289fc-17e9-45e9-ac24-434d69045d97/openstack-network-exporter/0.log" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.179071 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.185782 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8ee9-account-create-update-c7rsn"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.196308 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wtjbg"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.211827 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f28f211b-be26-4f15-92a1-36b91cb53bbb/ovsdbserver-sb/0.log" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.211888 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.212321 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-wtjbg"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.248595 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9z9pd"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.268504 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9z9pd"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.271432 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.294628 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-df05-account-create-update-nrbm4"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.319895 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4c2nt"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.328132 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4c2nt"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.334562 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0120-account-create-update-cj262"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.340444 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cb5c-account-create-update-n52qj"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.350716 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vbzqc"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.362632 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vbzqc"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.362711 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.371271 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373083 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-combined-ca-bundle\") pod \"6b263e8e-7618-4044-bed1-b35174d6a8f4\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373130 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-scripts\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373152 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-combined-ca-bundle\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373168 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdb-rundir\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373229 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-combined-ca-bundle\") pod \"4ba289fc-17e9-45e9-ac24-434d69045d97\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " Jan 30 05:29:44 crc 
kubenswrapper[4931]: I0130 05:29:44.373243 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovn-rundir\") pod \"4ba289fc-17e9-45e9-ac24-434d69045d97\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373262 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k758\" (UniqueName: \"kubernetes.io/projected/6b263e8e-7618-4044-bed1-b35174d6a8f4-kube-api-access-5k758\") pod \"6b263e8e-7618-4044-bed1-b35174d6a8f4\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373317 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9mdl\" (UniqueName: \"kubernetes.io/projected/f28f211b-be26-4f15-92a1-36b91cb53bbb-kube-api-access-k9mdl\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373341 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-metrics-certs-tls-certs\") pod \"4ba289fc-17e9-45e9-ac24-434d69045d97\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.373381 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-config\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.379496 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4ba289fc-17e9-45e9-ac24-434d69045d97-config\") pod \"4ba289fc-17e9-45e9-ac24-434d69045d97\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.379553 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config-secret\") pod \"6b263e8e-7618-4044-bed1-b35174d6a8f4\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.379588 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config\") pod \"6b263e8e-7618-4044-bed1-b35174d6a8f4\" (UID: \"6b263e8e-7618-4044-bed1-b35174d6a8f4\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.379611 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6ntv\" (UniqueName: \"kubernetes.io/projected/4ba289fc-17e9-45e9-ac24-434d69045d97-kube-api-access-j6ntv\") pod \"4ba289fc-17e9-45e9-ac24-434d69045d97\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.379654 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovs-rundir\") pod \"4ba289fc-17e9-45e9-ac24-434d69045d97\" (UID: \"4ba289fc-17e9-45e9-ac24-434d69045d97\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.380376 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xvdtt"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.380440 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xvdtt"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.381159 
4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba289fc-17e9-45e9-ac24-434d69045d97-config" (OuterVolumeSpecName: "config") pod "4ba289fc-17e9-45e9-ac24-434d69045d97" (UID: "4ba289fc-17e9-45e9-ac24-434d69045d97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.390496 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.390739 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-log" containerID="cri-o://e5dadd497214a5d2efc5b8027947f3661f7f73599b0778570358c42329955e8d" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.390964 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-config" (OuterVolumeSpecName: "config") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.391146 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-api" containerID="cri-o://d70e15b0f074e59f1a9f39048c2cf45a62e81400091cb70df139030d514fe003" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.392942 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-ntbbc"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.393518 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "4ba289fc-17e9-45e9-ac24-434d69045d97" (UID: "4ba289fc-17e9-45e9-ac24-434d69045d97"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.393627 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdbserver-sb-tls-certs\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.393670 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-metrics-certs-tls-certs\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.393811 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28f211b-be26-4f15-92a1-36b91cb53bbb-kube-api-access-k9mdl" (OuterVolumeSpecName: 
"kube-api-access-k9mdl") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "kube-api-access-k9mdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.393832 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"f28f211b-be26-4f15-92a1-36b91cb53bbb\" (UID: \"f28f211b-be26-4f15-92a1-36b91cb53bbb\") " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.394760 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9mdl\" (UniqueName: \"kubernetes.io/projected/f28f211b-be26-4f15-92a1-36b91cb53bbb-kube-api-access-k9mdl\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.394833 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.394890 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba289fc-17e9-45e9-ac24-434d69045d97-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.394945 4931 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.395233 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] 
Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.401322 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "4ba289fc-17e9-45e9-ac24-434d69045d97" (UID: "4ba289fc-17e9-45e9-ac24-434d69045d97"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.401346 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.401541 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-scripts" (OuterVolumeSpecName: "scripts") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.402211 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-x4mqp"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.407938 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-x4mqp"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.413811 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-b25zb"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.414039 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b263e8e-7618-4044-bed1-b35174d6a8f4-kube-api-access-5k758" (OuterVolumeSpecName: "kube-api-access-5k758") pod "6b263e8e-7618-4044-bed1-b35174d6a8f4" (UID: "6b263e8e-7618-4044-bed1-b35174d6a8f4"). InnerVolumeSpecName "kube-api-access-5k758". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.420043 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.420959 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-log" containerID="cri-o://a268ff4ead170d4fc7c25a89e846ed2d0f10278b94da4082529cc4ebd9ab4f0e" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.421028 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-metadata" containerID="cri-o://6c90254ae67ae50ab19fa555ce55d1839d94322882a09bb91ad616b62efcfeeb" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.423733 4931 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba289fc-17e9-45e9-ac24-434d69045d97-kube-api-access-j6ntv" (OuterVolumeSpecName: "kube-api-access-j6ntv") pod "4ba289fc-17e9-45e9-ac24-434d69045d97" (UID: "4ba289fc-17e9-45e9-ac24-434d69045d97"). InnerVolumeSpecName "kube-api-access-j6ntv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.425911 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7789bbd757-45b5w"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.434028 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d69b6c966-npv8t"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.434312 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d69b6c966-npv8t" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api-log" containerID="cri-o://0e0a199cc977b5213010336cc2b6c461a3916b61b6c3d9f6dc8eecc7d5c8d17e" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.434855 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d69b6c966-npv8t" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api" containerID="cri-o://44392d9ac535d9a3ce2ca47aa88e680823c3197a2e50d537aa67df4b03e52fd1" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.437645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.437986 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.445806 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.445962 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.481464 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-867d8cd54-77bnr"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.512076 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" containerID="cri-o://ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" gracePeriod=29 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.518187 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6ntv\" (UniqueName: 
\"kubernetes.io/projected/4ba289fc-17e9-45e9-ac24-434d69045d97-kube-api-access-j6ntv\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.518235 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.518246 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28f211b-be26-4f15-92a1-36b91cb53bbb-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.518261 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.518271 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ba289fc-17e9-45e9-ac24-434d69045d97-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.518281 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k758\" (UniqueName: \"kubernetes.io/projected/6b263e8e-7618-4044-bed1-b35174d6a8f4-kube-api-access-5k758\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.530110 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5f5d456c6b-66jxb"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.530370 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener-log" containerID="cri-o://f0f483180c30bc672edc2e00c840d52567eb9b0c61f8c285d3a7c2a185f38020" 
gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.530871 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener" containerID="cri-o://1bd0c14353cbfd196f658cae7f7167624a1cc818a0ca23ec5151f1c871a22e65" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.553802 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7c996f77-c9rqm"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.554147 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7c996f77-c9rqm" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker-log" containerID="cri-o://4ec5d987a513f3f04cf30f8d242bd5ee734a2387c5a279b070c18b72f4a56519" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.556598 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7c996f77-c9rqm" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker" containerID="cri-o://2c58ff417f0ff91cddebc47f633febd5ac50ac3ddd97dcc2fce28574c94ac8a6" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.580957 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cbdc6b6c8-m9v7c"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.586168 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6b263e8e-7618-4044-bed1-b35174d6a8f4" (UID: "6b263e8e-7618-4044-bed1-b35174d6a8f4"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.589525 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.589745 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2565fa42-f180-4948-8b2f-68c419d78d2b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.592072 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data kube-api-access-t9tkc], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/barbican-api-cbdc6b6c8-m9v7c" podUID="98fff7bd-db4c-462f-8f2c-34733f4e81ad" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.594548 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7789bbd757-45b5w"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.608331 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cb5c-account-create-update-n52qj"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.614299 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.614903 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="9bb44c01-e79f-42d8-912c-66db07c6b328" containerName="nova-cell1-conductor-conductor" containerID="cri-o://9aac5eb9a7735bf1efc6d134170c4297691e32c19fa1a2cd01ab0ae918243436" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.622262 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.626936 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jhn9j"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.631685 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jhn9j"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.639464 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s287f"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.646124 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.646362 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="88988b92-cd64-490d-b55f-959ecf4095af" containerName="nova-cell0-conductor-conductor" containerID="cri-o://83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.646680 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s287f"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.667172 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-ntbbc"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.677475 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8ee9-account-create-update-c7rsn"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.682457 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-df05-account-create-update-nrbm4"] Jan 30 05:29:44 crc kubenswrapper[4931]: W0130 05:29:44.686215 4931 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef60747_e73b_451c_b8e1_6abd596d31bb.slice/crio-5f23441d1b937de628ce230831d40e43097271e249db0186baccaa1a1137dc00 WatchSource:0}: Error finding container 5f23441d1b937de628ce230831d40e43097271e249db0186baccaa1a1137dc00: Status 404 returned error can't find the container with id 5f23441d1b937de628ce230831d40e43097271e249db0186baccaa1a1137dc00 Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.716467 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4931]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: if [ -n "cinder" ]; then Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="cinder" Jan 30 05:29:44 crc kubenswrapper[4931]: else Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4931]: fi Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4931]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4931]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4931]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4931]: # support updates Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.716959 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4931]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: if [ -n "nova_cell0" ]; then Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="nova_cell0" Jan 30 05:29:44 crc kubenswrapper[4931]: else Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4931]: fi Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4931]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4931]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4931]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4931]: # support updates Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.717279 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4931]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: if [ -n "glance" ]; then Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="glance" Jan 30 05:29:44 crc kubenswrapper[4931]: else Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4931]: fi Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4931]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4931]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4931]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4931]: # support updates Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.718586 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-df05-account-create-update-nrbm4" podUID="7ef60747-e73b-451c-b8e1-6abd596d31bb" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.718653 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-8ee9-account-create-update-c7rsn" podUID="5d4d7097-4e75-41cb-b451-6feb8e2184b9" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.718690 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" podUID="f493e630-c604-4fd1-9fa6-f26d6d1a179a" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.743363 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-b25zb"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.779455 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.782146 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.782322 
4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1acfa9c2-a802-404e-976b-93d9f99e1fbb" containerName="nova-scheduler-scheduler" containerID="cri-o://cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.824051 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21 is running failed: container process not found" containerID="ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.824533 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21 is running failed: container process not found" containerID="ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.824685 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21 is running failed: container process not found" containerID="ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.824704 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" 
podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="ovsdbserver-nb" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.832855 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-76fb878d5c-s22sw"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.833511 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.833104 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-76fb878d5c-s22sw" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-httpd" containerID="cri-o://3baca7478354f0fb4066c265761c1fd4465b993347daea084eeb2d40cd40bed6" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.833713 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-76fb878d5c-s22sw" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-server" containerID="cri-o://02a426537f79889d684c812318ea1dd0bd0af03a098fb1e7d47cd94e43353e1c" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.851250 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0120-account-create-update-cj262"] Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.916707 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="081e3873-ea99-4486-925f-784a98e49405" containerName="rabbitmq" containerID="cri-o://1bb7e19530d33f5a4cf134ca5c6644743c868cc750cac0c1bb313f0f47240dd8" gracePeriod=604800 Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.917442 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4931]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: if [ -n "nova_cell1" ]; then Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="nova_cell1" Jan 30 05:29:44 crc kubenswrapper[4931]: else Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4931]: fi Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4931]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4931]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4931]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4931]: # support updates Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.918194 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4931]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: if [ -n "nova_api" ]; then Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="nova_api" Jan 30 05:29:44 crc kubenswrapper[4931]: else Jan 30 05:29:44 crc kubenswrapper[4931]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4931]: fi Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4931]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4931]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4931]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4931]: # support updates Jan 30 05:29:44 crc kubenswrapper[4931]: Jan 30 05:29:44 crc kubenswrapper[4931]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.919032 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-326d-account-create-update-b25zb" podUID="2b6b4ccf-805f-463c-b8c9-d975fd2a9059" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.922990 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-0120-account-create-update-cj262" podUID="d13136a7-4633-4386-822d-ceb2cb3320b8" Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.934802 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:44 crc kubenswrapper[4931]: E0130 05:29:44.934883 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data podName:fc3f4796-66b1-452b-afca-5e62cbf2a53b nodeName:}" failed. No retries permitted until 2026-01-30 05:29:48.934869998 +0000 UTC m=+1324.304780255 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data") pod "rabbitmq-server-0" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b") : configmap "rabbitmq-config-data" not found Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.983574 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ba289fc-17e9-45e9-ac24-434d69045d97" (UID: "4ba289fc-17e9-45e9-ac24-434d69045d97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4931]: I0130 05:29:44.985637 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.006571 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b263e8e-7618-4044-bed1-b35174d6a8f4" (UID: "6b263e8e-7618-4044-bed1-b35174d6a8f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.008320 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dvktv_4ba289fc-17e9-45e9-ac24-434d69045d97/openstack-network-exporter/0.log" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.008389 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dvktv" event={"ID":"4ba289fc-17e9-45e9-ac24-434d69045d97","Type":"ContainerDied","Data":"39a86ec198f21c9ed97c5b274927fc46f2f6f56ea606ee080f8268afe4d2241b"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.008473 4931 scope.go:117] "RemoveContainer" containerID="82c70d68aab65fc3db72ee184a048732b17b72a09f49232810d0c430a261f1e7" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.008594 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dvktv" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.025886 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6b263e8e-7618-4044-bed1-b35174d6a8f4" (UID: "6b263e8e-7618-4044-bed1-b35174d6a8f4"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.032789 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8ee9-account-create-update-c7rsn" event={"ID":"5d4d7097-4e75-41cb-b451-6feb8e2184b9","Type":"ContainerStarted","Data":"f0d507bce832298463e6a094cc7b0f7eb6c19d37e2a1f9f33913556dc5ffc1c1"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.034067 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerName="galera" containerID="cri-o://1c39e215a63df32503630d9061f8096755ed069b7c32eec93d18140193ca977b" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.040714 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4ba289fc-17e9-45e9-ac24-434d69045d97" (UID: "4ba289fc-17e9-45e9-ac24-434d69045d97"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.056991 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.057022 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.057032 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.057040 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba289fc-17e9-45e9-ac24-434d69045d97-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.057054 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b263e8e-7618-4044-bed1-b35174d6a8f4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.057522 4931 generic.go:334] "Generic (PLEG): container finished" podID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerID="1d733edb3ceaca43f34355e23bbaaced9e55a731057ead7b89c96398337d6e11" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.057621 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798b7dc5fb-xl2zq" 
event={"ID":"ebe4f743-9a60-428f-8b58-14ba160d9fd7","Type":"ContainerDied","Data":"1d733edb3ceaca43f34355e23bbaaced9e55a731057ead7b89c96398337d6e11"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.064481 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.081412 4931 generic.go:334] "Generic (PLEG): container finished" podID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.081482 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-thxc2" event={"ID":"5732e34e-6330-4a36-9082-dbb50eede9f2","Type":"ContainerDied","Data":"ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.091487 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" event={"ID":"f493e630-c604-4fd1-9fa6-f26d6d1a179a","Type":"ContainerStarted","Data":"34afa4a36598164bccdcad1293ddead8d7610abc3c9551b334f25c08a708b5f9"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.103341 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f28f211b-be26-4f15-92a1-36b91cb53bbb" (UID: "f28f211b-be26-4f15-92a1-36b91cb53bbb"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.158906 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.158935 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28f211b-be26-4f15-92a1-36b91cb53bbb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166834 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="fa1a93081b269f4ada317cd8046ebb7f1a7c1edf1f6e97c13ed393eacd7e1973" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166870 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="cf71a5f4e5a8611b3edb8a350ba0e2eedbd78c0fb76770c94841152df4a3ab69" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166879 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="577bb47efe5f44d38e3c888fdf879028a229599eb11554a344d6a077afa58802" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166888 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="2aa8176e0269c78ed82e92b582f8a0a44311ad87daa1079e948a15315c72207f" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166895 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="cc9cceab2cf461936102038fbf1707f4be2e195decb5808193c7f51c4adb08d3" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166903 4931 generic.go:334] "Generic 
(PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="b4e57fcd32132c1d41dc41783803f5dfdbd53a0317437ca189732c5c62a33471" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166909 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="6088dfc85b1e09a936dd16faccae994e80e2ccb29840c2c09302842b83328fc2" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166918 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="072ac216076cb0fec2ec21789975f4b6fdf297b846d9774f980cb280a52a2718" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166924 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="840bb675eae49d372214aa49017516eb7cc03feb5f0cebb6fb56a2dd4d0837b9" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166930 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="01987d0b4f025a347544f55c09ea6cf4f3249f746d37222f35ee196eb4525b63" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166937 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="9eac75907fb7af02eb9159bedaf64c4ca7dee04ca441b549c3a48132b186515f" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166942 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="64945c3ef451f83b413801249e5cc8cedcb622a5a129ee80defec98e393eed29" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166948 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="de7939eb8c76f478b34a03e7dd08f4a308dc8bbb63a287db0a5b3eec8794cc7c" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: 
I0130 05:29:45.166954 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="e78b14aff0684ab7de691a18e86ce169b9e67b8f02342d8eba7927de4cb39ec6" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.166994 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"fa1a93081b269f4ada317cd8046ebb7f1a7c1edf1f6e97c13ed393eacd7e1973"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167019 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"cf71a5f4e5a8611b3edb8a350ba0e2eedbd78c0fb76770c94841152df4a3ab69"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167029 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"577bb47efe5f44d38e3c888fdf879028a229599eb11554a344d6a077afa58802"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167037 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"2aa8176e0269c78ed82e92b582f8a0a44311ad87daa1079e948a15315c72207f"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167045 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"cc9cceab2cf461936102038fbf1707f4be2e195decb5808193c7f51c4adb08d3"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167056 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"b4e57fcd32132c1d41dc41783803f5dfdbd53a0317437ca189732c5c62a33471"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167065 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"6088dfc85b1e09a936dd16faccae994e80e2ccb29840c2c09302842b83328fc2"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167073 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"072ac216076cb0fec2ec21789975f4b6fdf297b846d9774f980cb280a52a2718"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167081 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"840bb675eae49d372214aa49017516eb7cc03feb5f0cebb6fb56a2dd4d0837b9"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167089 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"01987d0b4f025a347544f55c09ea6cf4f3249f746d37222f35ee196eb4525b63"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167098 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"9eac75907fb7af02eb9159bedaf64c4ca7dee04ca441b549c3a48132b186515f"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167105 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"64945c3ef451f83b413801249e5cc8cedcb622a5a129ee80defec98e393eed29"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 
05:29:45.167113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"de7939eb8c76f478b34a03e7dd08f4a308dc8bbb63a287db0a5b3eec8794cc7c"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.167121 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"e78b14aff0684ab7de691a18e86ce169b9e67b8f02342d8eba7927de4cb39ec6"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.168746 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_49a63fb4-24bc-4834-b6e7-937688c5de09/ovsdbserver-nb/0.log" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.168793 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"49a63fb4-24bc-4834-b6e7-937688c5de09","Type":"ContainerDied","Data":"d92e6599441617be3a228c318ab5084a192dbb4da1df24b47362bc9f2366da37"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.168810 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92e6599441617be3a228c318ab5084a192dbb4da1df24b47362bc9f2366da37" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.169525 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df05-account-create-update-nrbm4" event={"ID":"7ef60747-e73b-451c-b8e1-6abd596d31bb","Type":"ContainerStarted","Data":"5f23441d1b937de628ce230831d40e43097271e249db0186baccaa1a1137dc00"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.178892 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7789bbd757-45b5w" event={"ID":"c0c7aeee-9023-433a-83d0-aa0e9942a0ed","Type":"ContainerStarted","Data":"eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.178935 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7789bbd757-45b5w" event={"ID":"c0c7aeee-9023-433a-83d0-aa0e9942a0ed","Type":"ContainerStarted","Data":"b30436eda9ab254987a1049643d0e45f01f12d10b3f44f43863aa93c4c7ce86b"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.194207 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0120-account-create-update-cj262" event={"ID":"d13136a7-4633-4386-822d-ceb2cb3320b8","Type":"ContainerStarted","Data":"94259a3980ca06eabd57f602644b7974c5802e08901f053b5b514caf5639d01b"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.214029 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cb5c-account-create-update-n52qj" event={"ID":"46ad7de9-e01d-414c-8a4d-9073ad986186","Type":"ContainerStarted","Data":"d1886781d10ecb06348f9abb9caccf69a1bc5841a942465cb5e9e6aa23d451dd"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.242595 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-326d-account-create-update-b25zb" event={"ID":"2b6b4ccf-805f-463c-b8c9-d975fd2a9059","Type":"ContainerStarted","Data":"c0d032c4bd8a6102c282961d37b4968dfb10aaf972fe8b42cd15f5070c0f0f3a"} Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.256008 4931 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:45 crc kubenswrapper[4931]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:45 crc kubenswrapper[4931]: Jan 30 05:29:45 crc kubenswrapper[4931]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:45 crc kubenswrapper[4931]: Jan 30 05:29:45 crc kubenswrapper[4931]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:45 crc kubenswrapper[4931]: Jan 30 
05:29:45 crc kubenswrapper[4931]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:45 crc kubenswrapper[4931]: Jan 30 05:29:45 crc kubenswrapper[4931]: if [ -n "barbican" ]; then Jan 30 05:29:45 crc kubenswrapper[4931]: GRANT_DATABASE="barbican" Jan 30 05:29:45 crc kubenswrapper[4931]: else Jan 30 05:29:45 crc kubenswrapper[4931]: GRANT_DATABASE="*" Jan 30 05:29:45 crc kubenswrapper[4931]: fi Jan 30 05:29:45 crc kubenswrapper[4931]: Jan 30 05:29:45 crc kubenswrapper[4931]: # going for maximum compatibility here: Jan 30 05:29:45 crc kubenswrapper[4931]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:45 crc kubenswrapper[4931]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:45 crc kubenswrapper[4931]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:45 crc kubenswrapper[4931]: # support updates Jan 30 05:29:45 crc kubenswrapper[4931]: Jan 30 05:29:45 crc kubenswrapper[4931]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.276731 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-cb5c-account-create-update-n52qj" podUID="46ad7de9-e01d-414c-8a4d-9073ad986186" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.361977 4931 generic.go:334] "Generic (PLEG): container finished" podID="7c0ddaec-4521-4898-8649-262b52f24acb" containerID="754804bc268dc311547eddc996a035b132392b26798898d2ba034bc32dc1ee16" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.362290 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c0ddaec-4521-4898-8649-262b52f24acb","Type":"ContainerDied","Data":"754804bc268dc311547eddc996a035b132392b26798898d2ba034bc32dc1ee16"} Jan 30 05:29:45 crc 
kubenswrapper[4931]: I0130 05:29:45.375759 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.397754 4931 generic.go:334] "Generic (PLEG): container finished" podID="3415cfc4-a71a-4110-bf82-295181bb386f" containerID="3795752efe01b170e153d47107ec186f09220104cdec5c71d0a210a63580f858" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.397836 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3415cfc4-a71a-4110-bf82-295181bb386f","Type":"ContainerDied","Data":"3795752efe01b170e153d47107ec186f09220104cdec5c71d0a210a63580f858"} Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.401981 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_49a63fb4-24bc-4834-b6e7-937688c5de09/ovsdbserver-nb/0.log" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.402039 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.407403 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.424842 4931 scope.go:117] "RemoveContainer" containerID="998f26954c016e9a4be4fed72f68f879a1f7793c171311545d8f4958871325fd" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.484750 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-nb\") pod \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.484831 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-swift-storage-0\") pod \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.484863 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-config\") pod \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.484890 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-scripts\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.484942 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc 
kubenswrapper[4931]: I0130 05:29:45.484974 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-config\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485001 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-sb\") pod \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485048 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdbserver-nb-tls-certs\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485073 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdb-rundir\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485156 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnld\" (UniqueName: \"kubernetes.io/projected/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-kube-api-access-nrnld\") pod \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485198 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f2l8\" (UniqueName: 
\"kubernetes.io/projected/49a63fb4-24bc-4834-b6e7-937688c5de09-kube-api-access-6f2l8\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485239 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-svc\") pod \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\" (UID: \"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485279 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-metrics-certs-tls-certs\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.485333 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-combined-ca-bundle\") pod \"49a63fb4-24bc-4834-b6e7-937688c5de09\" (UID: \"49a63fb4-24bc-4834-b6e7-937688c5de09\") " Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.505819 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f28f211b-be26-4f15-92a1-36b91cb53bbb/ovsdbserver-sb/0.log" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.506044 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.506761 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-scripts" (OuterVolumeSpecName: "scripts") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.540449 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.561165 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-kube-api-access-nrnld" (OuterVolumeSpecName: "kube-api-access-nrnld") pod "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" (UID: "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d"). InnerVolumeSpecName "kube-api-access-nrnld". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.569279 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-config" (OuterVolumeSpecName: "config") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.588640 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.588671 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.588683 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnld\" (UniqueName: \"kubernetes.io/projected/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-kube-api-access-nrnld\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.588692 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a63fb4-24bc-4834-b6e7-937688c5de09-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.615205 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b263e8e-7618-4044-bed1-b35174d6a8f4" path="/var/lib/kubelet/pods/6b263e8e-7618-4044-bed1-b35174d6a8f4/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.617065 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdb7d70-31a9-4d52-aae0-072e8c62a23f" path="/var/lib/kubelet/pods/6bdb7d70-31a9-4d52-aae0-072e8c62a23f/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.618404 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee75b9c-df74-490e-94ff-21eacce0b65a" path="/var/lib/kubelet/pods/6ee75b9c-df74-490e-94ff-21eacce0b65a/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.619019 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9262fbc3-2503-4252-b2dd-10cd8dcfbd6f" path="/var/lib/kubelet/pods/9262fbc3-2503-4252-b2dd-10cd8dcfbd6f/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.620380 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b9ebe73-0201-4486-9de9-e8828e84de53" path="/var/lib/kubelet/pods/9b9ebe73-0201-4486-9de9-e8828e84de53/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.620997 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a8f8fe-306a-4373-bbb0-d96f2b498d62" path="/var/lib/kubelet/pods/b0a8f8fe-306a-4373-bbb0-d96f2b498d62/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.621861 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b14699-8089-4af7-b0bd-654a8fda9715" path="/var/lib/kubelet/pods/c3b14699-8089-4af7-b0bd-654a8fda9715/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.621888 4931 generic.go:334] "Generic (PLEG): container finished" podID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerID="e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.622118 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.622974 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae14e96-e869-491f-bbab-32bccf87cc10" path="/var/lib/kubelet/pods/cae14e96-e869-491f-bbab-32bccf87cc10/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.623545 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a63fb4-24bc-4834-b6e7-937688c5de09-kube-api-access-6f2l8" (OuterVolumeSpecName: "kube-api-access-6f2l8") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). InnerVolumeSpecName "kube-api-access-6f2l8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.623604 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5cbb37a-882a-46cf-9cee-0543ac708004" path="/var/lib/kubelet/pods/d5cbb37a-882a-46cf-9cee-0543ac708004/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.623752 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.624857 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1ef5f2-7d57-4f89-9b48-9c603b322e5e" path="/var/lib/kubelet/pods/da1ef5f2-7d57-4f89-9b48-9c603b322e5e/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.640128 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6b82f5-5c39-4101-b9f8-05aaf9547a0b" path="/var/lib/kubelet/pods/df6b82f5-5c39-4101-b9f8-05aaf9547a0b/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.642053 4931 generic.go:334] "Generic (PLEG): container finished" podID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerID="a268ff4ead170d4fc7c25a89e846ed2d0f10278b94da4082529cc4ebd9ab4f0e" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.667071 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65373ae-84e0-4338-be4c-8cc8bd2d3fb0" path="/var/lib/kubelet/pods/e65373ae-84e0-4338-be4c-8cc8bd2d3fb0/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.668382 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8624816-8c2c-4d9c-b3a5-426253850926" path="/var/lib/kubelet/pods/e8624816-8c2c-4d9c-b3a5-426253850926/volumes" Jan 
30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.669023 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac7a7da-7577-4269-8e37-fd964be6f75c" path="/var/lib/kubelet/pods/fac7a7da-7577-4269-8e37-fd964be6f75c/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.669642 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b" path="/var/lib/kubelet/pods/ff26ff0b-dc1f-4605-bf81-1a5dcde91d1b/volumes" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.689581 4931 generic.go:334] "Generic (PLEG): container finished" podID="58928fea-709c-44d8-bd12-23937da8e2c4" containerID="0e0a199cc977b5213010336cc2b6c461a3916b61b6c3d9f6dc8eecc7d5c8d17e" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.691676 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9tkc\" (UniqueName: \"kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.691961 4931 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.691882 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data\") pod \"barbican-api-cbdc6b6c8-m9v7c\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.692024 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. 
No retries permitted until 2026-01-30 05:29:49.692009514 +0000 UTC m=+1325.061919771 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : secret "barbican-config-data" not found Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.692327 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.692365 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data podName:081e3873-ea99-4486-925f-784a98e49405 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:49.692352543 +0000 UTC m=+1325.062262800 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data") pod "rabbitmq-cell1-server-0" (UID: "081e3873-ea99-4486-925f-784a98e49405") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.692472 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f2l8\" (UniqueName: \"kubernetes.io/projected/49a63fb4-24bc-4834-b6e7-937688c5de09-kube-api-access-6f2l8\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.692496 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.694302 4931 projected.go:194] Error preparing data for projected volume kube-api-access-t9tkc for pod openstack/barbican-api-cbdc6b6c8-m9v7c: failed to fetch token: serviceaccounts 
"barbican-barbican" not found Jan 30 05:29:45 crc kubenswrapper[4931]: E0130 05:29:45.694343 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc podName:98fff7bd-db4c-462f-8f2c-34733f4e81ad nodeName:}" failed. No retries permitted until 2026-01-30 05:29:49.694333949 +0000 UTC m=+1325.064244206 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9tkc" (UniqueName: "kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc") pod "barbican-api-cbdc6b6c8-m9v7c" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.710320 4931 generic.go:334] "Generic (PLEG): container finished" podID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerID="4ec5d987a513f3f04cf30f8d242bd5ee734a2387c5a279b070c18b72f4a56519" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.725434 4931 generic.go:334] "Generic (PLEG): container finished" podID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerID="e5dadd497214a5d2efc5b8027947f3661f7f73599b0778570358c42329955e8d" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.728923 4931 generic.go:334] "Generic (PLEG): container finished" podID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerID="f0f483180c30bc672edc2e00c840d52567eb9b0c61f8c285d3a7c2a185f38020" exitCode=143 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.746547 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.185:8776/healthcheck\": dial tcp 10.217.0.185:8776: connect: connection refused" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.783761 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="623f3c8f-d741-4ba4-baca-905a13102f38" containerID="3a989cbeb7f4bf86d12831f4d3313ed2a342cf72d86cb0362de32fea4fb7324a" exitCode=1 Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.784285 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.785585 4931 scope.go:117] "RemoveContainer" containerID="3a989cbeb7f4bf86d12831f4d3313ed2a342cf72d86cb0362de32fea4fb7324a" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.906108 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" (UID: "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.909486 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 30 05:29:45 crc kubenswrapper[4931]: I0130 05:29:45.996187 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-config" (OuterVolumeSpecName: "config") pod "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" (UID: "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.004192 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.004219 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.004228 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.016569 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.018472 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" (UID: "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.028097 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" (UID: "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.106414 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.106649 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.106661 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.110770 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.111094 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" (UID: "1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.146509 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "49a63fb4-24bc-4834-b6e7-937688c5de09" (UID: "49a63fb4-24bc-4834-b6e7-937688c5de09"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.207928 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.207954 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a63fb4-24bc-4834-b6e7-937688c5de09-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.207963 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4931]: E0130 05:29:46.338405 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06 is running failed: container process not found" containerID="83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:46 crc kubenswrapper[4931]: E0130 05:29:46.338620 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06 is running failed: container process not found" containerID="83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:46 crc kubenswrapper[4931]: E0130 05:29:46.338796 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06 is running failed: container process not found" containerID="83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:46 crc kubenswrapper[4931]: E0130 05:29:46.338817 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="88988b92-cd64-490d-b55f-959ecf4095af" containerName="nova-cell0-conductor-conductor" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346237 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f28f211b-be26-4f15-92a1-36b91cb53bbb","Type":"ContainerDied","Data":"920213aded2a6124fc2a4c0ef0f31260bf1b62f8e7693371989b52df16882f74"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 
05:29:46.346376 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" event={"ID":"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d","Type":"ContainerDied","Data":"e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346391 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-ctzjd" event={"ID":"1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d","Type":"ContainerDied","Data":"645723e127490c600cf593cc161f0207c0a197195fa54096da51b7634ddd33ac"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346402 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74","Type":"ContainerDied","Data":"a268ff4ead170d4fc7c25a89e846ed2d0f10278b94da4082529cc4ebd9ab4f0e"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346415 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d69b6c966-npv8t" event={"ID":"58928fea-709c-44d8-bd12-23937da8e2c4","Type":"ContainerDied","Data":"0e0a199cc977b5213010336cc2b6c461a3916b61b6c3d9f6dc8eecc7d5c8d17e"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346462 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c996f77-c9rqm" event={"ID":"7729e2d8-6c8c-4759-9e5d-535ad1586f47","Type":"ContainerDied","Data":"4ec5d987a513f3f04cf30f8d242bd5ee734a2387c5a279b070c18b72f4a56519"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346476 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"406c25f3-c398-4ace-ba4b-1d9b48b289a2","Type":"ContainerDied","Data":"e5dadd497214a5d2efc5b8027947f3661f7f73599b0778570358c42329955e8d"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346493 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" 
event={"ID":"728a2e60-915e-4447-9465-aa64f7f5c7bb","Type":"ContainerDied","Data":"f0f483180c30bc672edc2e00c840d52567eb9b0c61f8c285d3a7c2a185f38020"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346563 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" event={"ID":"9ac55021-a07e-443f-9ee9-e7516556b975","Type":"ContainerStarted","Data":"1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346581 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6xxt5" event={"ID":"623f3c8f-d741-4ba4-baca-905a13102f38","Type":"ContainerDied","Data":"3a989cbeb7f4bf86d12831f4d3313ed2a342cf72d86cb0362de32fea4fb7324a"} Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.346631 4931 scope.go:117] "RemoveContainer" containerID="4583c4efc80289000d7023b793d84dc55442d51907c690cc558e927738cb2e88" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.380007 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cbdc6b6c8-m9v7c" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.414980 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98fff7bd-db4c-462f-8f2c-34733f4e81ad-logs\") pod \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.415030 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-internal-tls-certs\") pod \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.415085 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-public-tls-certs\") pod \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.415222 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data-custom\") pod \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.415292 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-combined-ca-bundle\") pod \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\" (UID: \"98fff7bd-db4c-462f-8f2c-34733f4e81ad\") " Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.417036 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/98fff7bd-db4c-462f-8f2c-34733f4e81ad-logs" (OuterVolumeSpecName: "logs") pod "98fff7bd-db4c-462f-8f2c-34733f4e81ad" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.423312 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "98fff7bd-db4c-462f-8f2c-34733f4e81ad" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.424400 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "98fff7bd-db4c-462f-8f2c-34733f4e81ad" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.432124 4931 scope.go:117] "RemoveContainer" containerID="8f3ed5c70b0a0c2e85aa57cc8cb01dca94391f61dba6acbcb746da8a8225d47c" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.435792 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98fff7bd-db4c-462f-8f2c-34733f4e81ad" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4931]: I0130 05:29:46.445016 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "98fff7bd-db4c-462f-8f2c-34733f4e81ad" (UID: "98fff7bd-db4c-462f-8f2c-34733f4e81ad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.481636 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-c7rsn" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.487152 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.492124 4931 scope.go:117] "RemoveContainer" containerID="e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.513569 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.516237 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4flg\" (UniqueName: \"kubernetes.io/projected/5d4d7097-4e75-41cb-b451-6feb8e2184b9-kube-api-access-q4flg\") pod \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.516505 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d7097-4e75-41cb-b451-6feb8e2184b9-operator-scripts\") pod \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\" (UID: \"5d4d7097-4e75-41cb-b451-6feb8e2184b9\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.516920 4931 reconciler_common.go:293] 
"Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.516933 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.516941 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98fff7bd-db4c-462f-8f2c-34733f4e81ad-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.516951 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.516962 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.519378 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4d7097-4e75-41cb-b451-6feb8e2184b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d4d7097-4e75-41cb-b451-6feb8e2184b9" (UID: "5d4d7097-4e75-41cb-b451-6feb8e2184b9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.530537 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4d7097-4e75-41cb-b451-6feb8e2184b9-kube-api-access-q4flg" (OuterVolumeSpecName: "kube-api-access-q4flg") pod "5d4d7097-4e75-41cb-b451-6feb8e2184b9" (UID: "5d4d7097-4e75-41cb-b451-6feb8e2184b9"). InnerVolumeSpecName "kube-api-access-q4flg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.533586 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-dvktv"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.542091 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-dvktv"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.549027 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-ctzjd"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.554763 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-ctzjd"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.621543 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4d7097-4e75-41cb-b451-6feb8e2184b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.621570 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4flg\" (UniqueName: \"kubernetes.io/projected/5d4d7097-4e75-41cb-b451-6feb8e2184b9-kube-api-access-q4flg\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.735169 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.753158 4931 scope.go:117] "RemoveContainer" containerID="d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.769883 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.781582 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-b25zb" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.813783 4931 generic.go:334] "Generic (PLEG): container finished" podID="98d21216-5a0f-422c-9642-0ea353a33e82" containerID="02a426537f79889d684c812318ea1dd0bd0af03a098fb1e7d47cd94e43353e1c" exitCode=0 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.813815 4931 generic.go:334] "Generic (PLEG): container finished" podID="98d21216-5a0f-422c-9642-0ea353a33e82" containerID="3baca7478354f0fb4066c265761c1fd4465b993347daea084eeb2d40cd40bed6" exitCode=0 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.813880 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76fb878d5c-s22sw" event={"ID":"98d21216-5a0f-422c-9642-0ea353a33e82","Type":"ContainerDied","Data":"02a426537f79889d684c812318ea1dd0bd0af03a098fb1e7d47cd94e43353e1c"} Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.813940 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76fb878d5c-s22sw" event={"ID":"98d21216-5a0f-422c-9642-0ea353a33e82","Type":"ContainerDied","Data":"3baca7478354f0fb4066c265761c1fd4465b993347daea084eeb2d40cd40bed6"} Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.813959 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76fb878d5c-s22sw" 
event={"ID":"98d21216-5a0f-422c-9642-0ea353a33e82","Type":"ContainerDied","Data":"68d0e2dfe8dc67ba7ff79544ecf0a950e34ec34379d61e5a1edf698fb315e6f7"} Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.813972 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d0e2dfe8dc67ba7ff79544ecf0a950e34ec34379d61e5a1edf698fb315e6f7" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.822589 4931 generic.go:334] "Generic (PLEG): container finished" podID="88988b92-cd64-490d-b55f-959ecf4095af" containerID="83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06" exitCode=0 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.822648 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"88988b92-cd64-490d-b55f-959ecf4095af","Type":"ContainerDied","Data":"83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06"} Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.822678 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"88988b92-cd64-490d-b55f-959ecf4095af","Type":"ContainerDied","Data":"56a8c3403b77c67382071da65bd384ea85d43f4776ebb7971c9a14fd4e392984"} Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.822689 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56a8c3403b77c67382071da65bd384ea85d43f4776ebb7971c9a14fd4e392984" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830161 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc6vk\" (UniqueName: \"kubernetes.io/projected/f493e630-c604-4fd1-9fa6-f26d6d1a179a-kube-api-access-zc6vk\") pod \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830292 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvzsw\" 
(UniqueName: \"kubernetes.io/projected/2565fa42-f180-4948-8b2f-68c419d78d2b-kube-api-access-qvzsw\") pod \"2565fa42-f180-4948-8b2f-68c419d78d2b\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830333 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swhwl\" (UniqueName: \"kubernetes.io/projected/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-kube-api-access-swhwl\") pod \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\" (UID: \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830349 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-combined-ca-bundle\") pod \"2565fa42-f180-4948-8b2f-68c419d78d2b\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830373 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-vencrypt-tls-certs\") pod \"2565fa42-f180-4948-8b2f-68c419d78d2b\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830434 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-nova-novncproxy-tls-certs\") pod \"2565fa42-f180-4948-8b2f-68c419d78d2b\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830462 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-operator-scripts\") pod \"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\" (UID: 
\"2b6b4ccf-805f-463c-b8c9-d975fd2a9059\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830488 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f493e630-c604-4fd1-9fa6-f26d6d1a179a-operator-scripts\") pod \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\" (UID: \"f493e630-c604-4fd1-9fa6-f26d6d1a179a\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.830557 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-config-data\") pod \"2565fa42-f180-4948-8b2f-68c419d78d2b\" (UID: \"2565fa42-f180-4948-8b2f-68c419d78d2b\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.841078 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b6b4ccf-805f-463c-b8c9-d975fd2a9059" (UID: "2b6b4ccf-805f-463c-b8c9-d975fd2a9059"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.841257 4931 generic.go:334] "Generic (PLEG): container finished" podID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerID="9398f7e713fb447b3a151d286be2d2910e4d8535fd421e906c46b8cc2c9a4728" exitCode=0 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.841328 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd","Type":"ContainerDied","Data":"9398f7e713fb447b3a151d286be2d2910e4d8535fd421e906c46b8cc2c9a4728"} Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.841369 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd","Type":"ContainerDied","Data":"300c4ac1a78a0898043a5bb9c0ea1e976d3646b2689510ee5ed5d0a93470d249"} Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.841381 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="300c4ac1a78a0898043a5bb9c0ea1e976d3646b2689510ee5ed5d0a93470d249" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.844625 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2565fa42-f180-4948-8b2f-68c419d78d2b-kube-api-access-qvzsw" (OuterVolumeSpecName: "kube-api-access-qvzsw") pod "2565fa42-f180-4948-8b2f-68c419d78d2b" (UID: "2565fa42-f180-4948-8b2f-68c419d78d2b"). InnerVolumeSpecName "kube-api-access-qvzsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.845713 4931 scope.go:117] "RemoveContainer" containerID="e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.846815 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f493e630-c604-4fd1-9fa6-f26d6d1a179a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f493e630-c604-4fd1-9fa6-f26d6d1a179a" (UID: "f493e630-c604-4fd1-9fa6-f26d6d1a179a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:46.853082 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e\": container with ID starting with e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e not found: ID does not exist" containerID="e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.853113 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e"} err="failed to get container status \"e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e\": rpc error: code = NotFound desc = could not find container \"e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e\": container with ID starting with e47aded0e7f33012a293c39a0b4c3359af98c56ec7e3320fc16509dd1760490e not found: ID does not exist" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.853134 4931 scope.go:117] "RemoveContainer" containerID="d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:46.855176 4931 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2\": container with ID starting with d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2 not found: ID does not exist" containerID="d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.855226 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2"} err="failed to get container status \"d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2\": rpc error: code = NotFound desc = could not find container \"d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2\": container with ID starting with d6f739809f3c40dd44a526d132e36537732145204e408c6bc109beb8752471c2 not found: ID does not exist"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.855282 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f493e630-c604-4fd1-9fa6-f26d6d1a179a-kube-api-access-zc6vk" (OuterVolumeSpecName: "kube-api-access-zc6vk") pod "f493e630-c604-4fd1-9fa6-f26d6d1a179a" (UID: "f493e630-c604-4fd1-9fa6-f26d6d1a179a"). InnerVolumeSpecName "kube-api-access-zc6vk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.859974 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-kube-api-access-swhwl" (OuterVolumeSpecName: "kube-api-access-swhwl") pod "2b6b4ccf-805f-463c-b8c9-d975fd2a9059" (UID: "2b6b4ccf-805f-463c-b8c9-d975fd2a9059"). InnerVolumeSpecName "kube-api-access-swhwl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.871306 4931 generic.go:334] "Generic (PLEG): container finished" podID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerID="2823dcc09d156bc746ffbc3ab196c3d6e136f453bf377837d7cce10861a168f4" exitCode=0
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.871355 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e","Type":"ContainerDied","Data":"2823dcc09d156bc746ffbc3ab196c3d6e136f453bf377837d7cce10861a168f4"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.871379 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e","Type":"ContainerDied","Data":"d58c6ad814535983bdd3740a7cd3d8c344b8ddf68658a3e7d51e045ec46e07d7"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.871389 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58c6ad814535983bdd3740a7cd3d8c344b8ddf68658a3e7d51e045ec46e07d7"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.871492 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0120-account-create-update-cj262"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.872718 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8ee9-account-create-update-c7rsn" event={"ID":"5d4d7097-4e75-41cb-b451-6feb8e2184b9","Type":"ContainerDied","Data":"f0d507bce832298463e6a094cc7b0f7eb6c19d37e2a1f9f33913556dc5ffc1c1"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.872757 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8ee9-account-create-update-c7rsn"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.883176 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0120-account-create-update-cj262" event={"ID":"d13136a7-4633-4386-822d-ceb2cb3320b8","Type":"ContainerDied","Data":"94259a3980ca06eabd57f602644b7974c5802e08901f053b5b514caf5639d01b"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.883262 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0120-account-create-update-cj262"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.886972 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-10f6-account-create-update-ntbbc" event={"ID":"f493e630-c604-4fd1-9fa6-f26d6d1a179a","Type":"ContainerDied","Data":"34afa4a36598164bccdcad1293ddead8d7610abc3c9551b334f25c08a708b5f9"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.887066 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-10f6-account-create-update-ntbbc"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.902499 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6xxt5" event={"ID":"623f3c8f-d741-4ba4-baca-905a13102f38","Type":"ContainerStarted","Data":"7543927ad63a3b9e73b08c0c9fa7a83b683adacd56d1d27f7d72109cd07d12dd"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.903780 4931 scope.go:117] "RemoveContainer" containerID="7543927ad63a3b9e73b08c0c9fa7a83b683adacd56d1d27f7d72109cd07d12dd"
Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:46.908140 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-6xxt5_openstack(623f3c8f-d741-4ba4-baca-905a13102f38)\"" pod="openstack/root-account-create-update-6xxt5" podUID="623f3c8f-d741-4ba4-baca-905a13102f38"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.922003 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-326d-account-create-update-b25zb"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.922022 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-326d-account-create-update-b25zb" event={"ID":"2b6b4ccf-805f-463c-b8c9-d975fd2a9059","Type":"ContainerDied","Data":"c0d032c4bd8a6102c282961d37b4968dfb10aaf972fe8b42cd15f5070c0f0f3a"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.931733 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13136a7-4633-4386-822d-ceb2cb3320b8-operator-scripts\") pod \"d13136a7-4633-4386-822d-ceb2cb3320b8\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.931922 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6j5x\" (UniqueName: \"kubernetes.io/projected/d13136a7-4633-4386-822d-ceb2cb3320b8-kube-api-access-l6j5x\") pod \"d13136a7-4633-4386-822d-ceb2cb3320b8\" (UID: \"d13136a7-4633-4386-822d-ceb2cb3320b8\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.932116 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13136a7-4633-4386-822d-ceb2cb3320b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d13136a7-4633-4386-822d-ceb2cb3320b8" (UID: "d13136a7-4633-4386-822d-ceb2cb3320b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.932397 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvzsw\" (UniqueName: \"kubernetes.io/projected/2565fa42-f180-4948-8b2f-68c419d78d2b-kube-api-access-qvzsw\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.932409 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swhwl\" (UniqueName: \"kubernetes.io/projected/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-kube-api-access-swhwl\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.932429 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b6b4ccf-805f-463c-b8c9-d975fd2a9059-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.932438 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f493e630-c604-4fd1-9fa6-f26d6d1a179a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.932447 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d13136a7-4633-4386-822d-ceb2cb3320b8-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.932456 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc6vk\" (UniqueName: \"kubernetes.io/projected/f493e630-c604-4fd1-9fa6-f26d6d1a179a-kube-api-access-zc6vk\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.943447 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13136a7-4633-4386-822d-ceb2cb3320b8-kube-api-access-l6j5x" (OuterVolumeSpecName: "kube-api-access-l6j5x") pod "d13136a7-4633-4386-822d-ceb2cb3320b8" (UID: "d13136a7-4633-4386-822d-ceb2cb3320b8"). InnerVolumeSpecName "kube-api-access-l6j5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.956402 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-config-data" (OuterVolumeSpecName: "config-data") pod "2565fa42-f180-4948-8b2f-68c419d78d2b" (UID: "2565fa42-f180-4948-8b2f-68c419d78d2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.958259 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" event={"ID":"9ac55021-a07e-443f-9ee9-e7516556b975","Type":"ContainerStarted","Data":"9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.959771 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener-log" containerID="cri-o://1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d" gracePeriod=30
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.960302 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener" containerID="cri-o://9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae" gracePeriod=30
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.983667 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "2565fa42-f180-4948-8b2f-68c419d78d2b" (UID: "2565fa42-f180-4948-8b2f-68c419d78d2b"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.993802 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-df05-account-create-update-nrbm4" event={"ID":"7ef60747-e73b-451c-b8e1-6abd596d31bb","Type":"ContainerDied","Data":"5f23441d1b937de628ce230831d40e43097271e249db0186baccaa1a1137dc00"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:46.993857 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f23441d1b937de628ce230831d40e43097271e249db0186baccaa1a1137dc00"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.010948 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2565fa42-f180-4948-8b2f-68c419d78d2b" (UID: "2565fa42-f180-4948-8b2f-68c419d78d2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.017514 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df05-account-create-update-nrbm4"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.017959 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7789bbd757-45b5w" event={"ID":"c0c7aeee-9023-433a-83d0-aa0e9942a0ed","Type":"ContainerStarted","Data":"a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.018153 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7789bbd757-45b5w" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker-log" containerID="cri-o://eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123" gracePeriod=30
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.018219 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7789bbd757-45b5w" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker" containerID="cri-o://a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33" gracePeriod=30
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.035019 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.036450 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ef60747-e73b-451c-b8e1-6abd596d31bb-operator-scripts\") pod \"7ef60747-e73b-451c-b8e1-6abd596d31bb\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.036645 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjv6q\" (UniqueName: \"kubernetes.io/projected/7ef60747-e73b-451c-b8e1-6abd596d31bb-kube-api-access-xjv6q\") pod \"7ef60747-e73b-451c-b8e1-6abd596d31bb\" (UID: \"7ef60747-e73b-451c-b8e1-6abd596d31bb\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.041481 4931 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.048569 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.048586 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6j5x\" (UniqueName: \"kubernetes.io/projected/d13136a7-4633-4386-822d-ceb2cb3320b8-kube-api-access-l6j5x\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.048598 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.044220 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" podStartSLOduration=6.044202739 podStartE2EDuration="6.044202739s" podCreationTimestamp="2026-01-30 05:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:29:46.984036348 +0000 UTC m=+1322.353946605" watchObservedRunningTime="2026-01-30 05:29:47.044202739 +0000 UTC m=+1322.414112996"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.042952 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef","Type":"ContainerDied","Data":"1c39e215a63df32503630d9061f8096755ed069b7c32eec93d18140193ca977b"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.042930 4931 generic.go:334] "Generic (PLEG): container finished" podID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerID="1c39e215a63df32503630d9061f8096755ed069b7c32eec93d18140193ca977b" exitCode=0
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.043563 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef60747-e73b-451c-b8e1-6abd596d31bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ef60747-e73b-451c-b8e1-6abd596d31bb" (UID: "7ef60747-e73b-451c-b8e1-6abd596d31bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.046640 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef60747-e73b-451c-b8e1-6abd596d31bb-kube-api-access-xjv6q" (OuterVolumeSpecName: "kube-api-access-xjv6q") pod "7ef60747-e73b-451c-b8e1-6abd596d31bb" (UID: "7ef60747-e73b-451c-b8e1-6abd596d31bb"). InnerVolumeSpecName "kube-api-access-xjv6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.048747 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef","Type":"ContainerDied","Data":"651858dcd740868b54f1818387952f7e3dd92b06537502abf826f277b0f1c2f7"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.048767 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651858dcd740868b54f1818387952f7e3dd92b06537502abf826f277b0f1c2f7"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.046771 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.076214 4931 generic.go:334] "Generic (PLEG): container finished" podID="9bb44c01-e79f-42d8-912c-66db07c6b328" containerID="9aac5eb9a7735bf1efc6d134170c4297691e32c19fa1a2cd01ab0ae918243436" exitCode=0
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.076291 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bb44c01-e79f-42d8-912c-66db07c6b328","Type":"ContainerDied","Data":"9aac5eb9a7735bf1efc6d134170c4297691e32c19fa1a2cd01ab0ae918243436"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.076410 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.081515 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "2565fa42-f180-4948-8b2f-68c419d78d2b" (UID: "2565fa42-f180-4948-8b2f-68c419d78d2b"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.098904 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.104288 4931 generic.go:334] "Generic (PLEG): container finished" podID="2565fa42-f180-4948-8b2f-68c419d78d2b" containerID="4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe" exitCode=0
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.104362 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cbdc6b6c8-m9v7c"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.105748 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2565fa42-f180-4948-8b2f-68c419d78d2b","Type":"ContainerDied","Data":"4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.105781 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2565fa42-f180-4948-8b2f-68c419d78d2b","Type":"ContainerDied","Data":"182ca03d45434848993e7087501801fbb8335a526ad9960a6da96e395124bc68"}
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.105797 4931 scope.go:117] "RemoveContainer" containerID="4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.105880 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.106544 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.114876 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-ntbbc"]
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.130029 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.134590 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-10f6-account-create-update-ntbbc"]
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.138113 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.149962 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rplvd\" (UniqueName: \"kubernetes.io/projected/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-kube-api-access-rplvd\") pod \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150009 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150030 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-combined-ca-bundle\") pod \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150068 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-scripts\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150085 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-public-tls-certs\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150109 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data-custom\") pod \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150132 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-internal-tls-certs\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150150 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data-custom\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150200 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-config-data\") pod \"88988b92-cd64-490d-b55f-959ecf4095af\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150231 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-galera-tls-certs\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.150251 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-combined-ca-bundle\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.189846 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-etc-machine-id\") pod \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.189919 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-run-httpd\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.189941 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hln5j\" (UniqueName: \"kubernetes.io/projected/88988b92-cd64-490d-b55f-959ecf4095af-kube-api-access-hln5j\") pod \"88988b92-cd64-490d-b55f-959ecf4095af\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190005 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-config-data\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190032 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-combined-ca-bundle\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190073 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-combined-ca-bundle\") pod \"88988b92-cd64-490d-b55f-959ecf4095af\" (UID: \"88988b92-cd64-490d-b55f-959ecf4095af\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190107 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kolla-config\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190149 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-public-tls-certs\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190230 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsxq8\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-kube-api-access-dsxq8\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190251 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190281 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-etc-machine-id\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190322 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-operator-scripts\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190343 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data\") pod \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190379 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-logs\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190402 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190441 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-log-httpd\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190494 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts\") pod \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190614 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbw9z\" (UniqueName: \"kubernetes.io/projected/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kube-api-access-fbw9z\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190645 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64zlj\" (UniqueName: \"kubernetes.io/projected/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-kube-api-access-64zlj\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190666 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-generated\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190685 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-internal-tls-certs\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190710 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-combined-ca-bundle\") pod \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\" (UID: \"c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.190726 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-default\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") "
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.191682 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ef60747-e73b-451c-b8e1-6abd596d31bb-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.191700 4931 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2565fa42-f180-4948-8b2f-68c419d78d2b-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.191709 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjv6q\" (UniqueName: \"kubernetes.io/projected/7ef60747-e73b-451c-b8e1-6abd596d31bb-kube-api-access-xjv6q\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.193671 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.194056 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.195051 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" (UID: "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.217797 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.220081 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-scripts" (OuterVolumeSpecName: "scripts") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.222852 4931 scope.go:117] "RemoveContainer" containerID="4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe"
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.222966 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88988b92-cd64-490d-b55f-959ecf4095af-kube-api-access-hln5j" (OuterVolumeSpecName: "kube-api-access-hln5j") pod "88988b92-cd64-490d-b55f-959ecf4095af" (UID: "88988b92-cd64-490d-b55f-959ecf4095af"). InnerVolumeSpecName "kube-api-access-hln5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.229497 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.229529 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-logs" (OuterVolumeSpecName: "logs") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.233694 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" (UID: "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.235131 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.236146 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kube-api-access-fbw9z" (OuterVolumeSpecName: "kube-api-access-fbw9z") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "kube-api-access-fbw9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.237932 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.239453 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-kube-api-access-64zlj" (OuterVolumeSpecName: "kube-api-access-64zlj") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "kube-api-access-64zlj".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.239817 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.240112 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.240619 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe\": container with ID starting with 4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe not found: ID does not exist" containerID="4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.240655 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe"} err="failed to get container status \"4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe\": rpc error: code = NotFound desc = could not find container \"4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe\": container with ID starting with 4d787c63b2588decfd0b1e48116ec781fe2f24d2cd2bea253561aac466911bbe not found: ID does not exist" 
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.245211 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-kube-api-access-rplvd" (OuterVolumeSpecName: "kube-api-access-rplvd") pod "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" (UID: "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd"). InnerVolumeSpecName "kube-api-access-rplvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.258620 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-config-data" (OuterVolumeSpecName: "config-data") pod "88988b92-cd64-490d-b55f-959ecf4095af" (UID: "88988b92-cd64-490d-b55f-959ecf4095af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.264548 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-kube-api-access-dsxq8" (OuterVolumeSpecName: "kube-api-access-dsxq8") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "kube-api-access-dsxq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.267818 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-b25zb"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.300948 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.300966 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-326d-account-create-update-b25zb"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.301522 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts" (OuterVolumeSpecName: "scripts") pod "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" (UID: "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.322007 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329027 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67hmc\" (UniqueName: \"kubernetes.io/projected/9bb44c01-e79f-42d8-912c-66db07c6b328-kube-api-access-67hmc\") pod \"9bb44c01-e79f-42d8-912c-66db07c6b328\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329067 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\" (UID: \"7e9a7f86-7e9d-4062-9c50-72d0d82e24ef\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329116 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-config-data\") pod \"9bb44c01-e79f-42d8-912c-66db07c6b328\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " Jan 30 05:29:47 crc kubenswrapper[4931]: W0130 05:29:47.329164 4931 mount_helper_common.go:34] Warning: mount cleanup skipped because path does not exist: /var/lib/kubelet/pods/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef/volumes/kubernetes.io~local-volume/local-storage02-crc Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329198 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329295 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift\") pod \"98d21216-5a0f-422c-9642-0ea353a33e82\" (UID: \"98d21216-5a0f-422c-9642-0ea353a33e82\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329315 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-combined-ca-bundle\") pod \"9bb44c01-e79f-42d8-912c-66db07c6b328\" (UID: \"9bb44c01-e79f-42d8-912c-66db07c6b328\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts\") pod \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\" (UID: \"bb6268b1-9fae-42a1-9f9e-0bed9c69cadd\") " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329933 4931 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329945 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsxq8\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-kube-api-access-dsxq8\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329957 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329965 4931 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329973 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329980 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329989 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbw9z\" (UniqueName: \"kubernetes.io/projected/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-kube-api-access-fbw9z\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.329997 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330005 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64zlj\" (UniqueName: \"kubernetes.io/projected/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-kube-api-access-64zlj\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330013 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330022 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rplvd\" (UniqueName: 
\"kubernetes.io/projected/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-kube-api-access-rplvd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330041 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330051 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330060 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330069 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330077 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330085 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.330093 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98d21216-5a0f-422c-9642-0ea353a33e82-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: 
I0130 05:29:47.330101 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hln5j\" (UniqueName: \"kubernetes.io/projected/88988b92-cd64-490d-b55f-959ecf4095af-kube-api-access-hln5j\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: W0130 05:29:47.331790 4931 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/98d21216-5a0f-422c-9642-0ea353a33e82/volumes/kubernetes.io~projected/etc-swift Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.331847 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: W0130 05:29:47.332163 4931 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd/volumes/kubernetes.io~secret/scripts Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.332181 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts" (OuterVolumeSpecName: "scripts") pod "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" (UID: "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.348649 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8ee9-account-create-update-c7rsn"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.358192 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8ee9-account-create-update-c7rsn"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.360231 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb44c01-e79f-42d8-912c-66db07c6b328-kube-api-access-67hmc" (OuterVolumeSpecName: "kube-api-access-67hmc") pod "9bb44c01-e79f-42d8-912c-66db07c6b328" (UID: "9bb44c01-e79f-42d8-912c-66db07c6b328"). InnerVolumeSpecName "kube-api-access-67hmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.366014 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7789bbd757-45b5w" podStartSLOduration=6.360559058 podStartE2EDuration="6.360559058s" podCreationTimestamp="2026-01-30 05:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:29:47.234096151 +0000 UTC m=+1322.604006408" watchObservedRunningTime="2026-01-30 05:29:47.360559058 +0000 UTC m=+1322.730469315" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.372672 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.407813 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" (UID: "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.413641 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.432653 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.432925 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67hmc\" (UniqueName: \"kubernetes.io/projected/9bb44c01-e79f-42d8-912c-66db07c6b328-kube-api-access-67hmc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.432960 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.432972 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.432982 4931 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98d21216-5a0f-422c-9642-0ea353a33e82-etc-swift\") on node \"crc\" DevicePath \"\"" 
Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.463621 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-config-data" (OuterVolumeSpecName: "config-data") pod "9bb44c01-e79f-42d8-912c-66db07c6b328" (UID: "9bb44c01-e79f-42d8-912c-66db07c6b328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.482508 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" path="/var/lib/kubelet/pods/1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d/volumes" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.483435 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6b4ccf-805f-463c-b8c9-d975fd2a9059" path="/var/lib/kubelet/pods/2b6b4ccf-805f-463c-b8c9-d975fd2a9059/volumes" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.483811 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba289fc-17e9-45e9-ac24-434d69045d97" path="/var/lib/kubelet/pods/4ba289fc-17e9-45e9-ac24-434d69045d97/volumes" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.484457 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4d7097-4e75-41cb-b451-6feb8e2184b9" path="/var/lib/kubelet/pods/5d4d7097-4e75-41cb-b451-6feb8e2184b9/volumes" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.484917 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" path="/var/lib/kubelet/pods/f28f211b-be26-4f15-92a1-36b91cb53bbb/volumes" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.485910 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f493e630-c604-4fd1-9fa6-f26d6d1a179a" path="/var/lib/kubelet/pods/f493e630-c604-4fd1-9fa6-f26d6d1a179a/volumes" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.521563 4931 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-config-data" (OuterVolumeSpecName: "config-data") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.523972 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.526210 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.535955 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.535983 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.536016 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.536025 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.585781 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.590227 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.595826 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data" (OuterVolumeSpecName: "config-data") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.637714 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.637740 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.637750 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.678227 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88988b92-cd64-490d-b55f-959ecf4095af" (UID: "88988b92-cd64-490d-b55f-959ecf4095af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.697027 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.731891 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" (UID: "7e9a7f86-7e9d-4062-9c50-72d0d82e24ef"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.750227 4931 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.750260 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88988b92-cd64-490d-b55f-959ecf4095af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.750268 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.832576 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "9bb44c01-e79f-42d8-912c-66db07c6b328" (UID: "9bb44c01-e79f-42d8-912c-66db07c6b328"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.837689 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" (UID: "c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.849405 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98d21216-5a0f-422c-9642-0ea353a33e82" (UID: "98d21216-5a0f-422c-9642-0ea353a33e82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.852163 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.852237 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d21216-5a0f-422c-9642-0ea353a33e82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.852287 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb44c01-e79f-42d8-912c-66db07c6b328-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865779 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865820 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865833 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865847 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cbdc6b6c8-m9v7c"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865856 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-cbdc6b6c8-m9v7c"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865890 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0120-account-create-update-cj262"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865902 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-0120-account-create-update-cj262"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865914 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865929 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865941 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-595b-account-create-update-hcchn"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865954 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-595b-account-create-update-hcchn"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.865970 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-595b-account-create-update-jk6fx"] Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866314 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88988b92-cd64-490d-b55f-959ecf4095af" containerName="nova-cell0-conductor-conductor" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866324 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="88988b92-cd64-490d-b55f-959ecf4095af" containerName="nova-cell0-conductor-conductor" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866338 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api-log" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866344 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api-log" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866359 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba289fc-17e9-45e9-ac24-434d69045d97" containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866365 4931 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba289fc-17e9-45e9-ac24-434d69045d97" containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866376 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866383 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866396 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866402 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866432 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerName="init" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866439 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerName="init" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866448 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2565fa42-f180-4948-8b2f-68c419d78d2b" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866455 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2565fa42-f180-4948-8b2f-68c419d78d2b" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866465 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="openstack-network-exporter" Jan 30 05:29:47 crc 
kubenswrapper[4931]: I0130 05:29:47.866472 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866495 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="ovsdbserver-nb" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866502 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="ovsdbserver-nb" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866512 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerName="dnsmasq-dns" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866519 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerName="dnsmasq-dns" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866531 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerName="mysql-bootstrap" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866538 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerName="mysql-bootstrap" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866549 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-httpd" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866555 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-httpd" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866565 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-server" Jan 30 05:29:47 crc kubenswrapper[4931]: 
I0130 05:29:47.866571 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-server" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866579 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb44c01-e79f-42d8-912c-66db07c6b328" containerName="nova-cell1-conductor-conductor" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866585 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb44c01-e79f-42d8-912c-66db07c6b328" containerName="nova-cell1-conductor-conductor" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866595 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="probe" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866601 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="probe" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866611 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="cinder-scheduler" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866618 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="cinder-scheduler" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866631 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="ovsdbserver-sb" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866638 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="ovsdbserver-sb" Jan 30 05:29:47 crc kubenswrapper[4931]: E0130 05:29:47.866653 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerName="galera" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 
05:29:47.866659 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerName="galera" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866810 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-server" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866820 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" containerName="proxy-httpd" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866829 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2565fa42-f180-4948-8b2f-68c419d78d2b" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866841 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="cinder-scheduler" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866848 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="88988b92-cd64-490d-b55f-959ecf4095af" containerName="nova-cell0-conductor-conductor" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866857 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866864 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866870 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c29d9d9-1679-4a4f-82da-9a2b0ae32a6d" containerName="dnsmasq-dns" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866876 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba289fc-17e9-45e9-ac24-434d69045d97" 
containerName="openstack-network-exporter" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866888 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" containerName="ovsdbserver-nb" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866899 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866911 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28f211b-be26-4f15-92a1-36b91cb53bbb" containerName="ovsdbserver-sb" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866920 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" containerName="cinder-api-log" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866930 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" containerName="galera" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866937 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb44c01-e79f-42d8-912c-66db07c6b328" containerName="nova-cell1-conductor-conductor" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.866946 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" containerName="probe" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.867479 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-595b-account-create-update-jk6fx"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.867495 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.867746 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" 
containerName="memcached" containerID="cri-o://d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.869643 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.870907 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="30f9b591-fea6-4010-99db-45eef2237cdc" containerName="kube-state-metrics" containerID="cri-o://100081f00d3d095ca7d8dca6b7343ac8590f3de539067c314527dbcd86ceca1c" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.872089 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-central-agent" containerID="cri-o://9186f065f875b33ba8c3817977c815fff7b67dc7404ddb06ea6a85fb64800755" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.872212 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="proxy-httpd" containerID="cri-o://25cc12087ab98d0fc79e679c4de5be61f557329f293a5a68393ba8b20a57c428" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.872245 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="sg-core" containerID="cri-o://0f6848e1ccd25c33da13cfce62f451555f794b35623ff124d320281a39cb9911" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.872275 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-notification-agent" 
containerID="cri-o://62da5f526098b3b9f5437a81119156f87878963a1c44c026236d9b63e20bbac5" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.879035 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.931774 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:46564->10.217.0.207:8775: read: connection reset by peer" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.934326 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:46568->10.217.0.207:8775: read: connection reset by peer" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.944178 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d69b6c966-npv8t" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:43470->10.217.0.166:9311: read: connection reset by peer" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.944292 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d69b6c966-npv8t" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:43464->10.217.0.166:9311: read: connection reset by peer" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.953589 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4gqzx"] Jan 30 05:29:47 crc 
kubenswrapper[4931]: I0130 05:29:47.954453 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrsv\" (UniqueName: \"kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.954533 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.954621 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9tkc\" (UniqueName: \"kubernetes.io/projected/98fff7bd-db4c-462f-8f2c-34733f4e81ad-kube-api-access-t9tkc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.954632 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98fff7bd-db4c-462f-8f2c-34733f4e81ad-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.961272 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sdn7d"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.962090 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="30f9b591-fea6-4010-99db-45eef2237cdc" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.176:8080/livez\": EOF" Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.971580 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-sync-4gqzx"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.977677 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sdn7d"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.982456 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-97bdbd495-2prdt"] Jan 30 05:29:47 crc kubenswrapper[4931]: I0130 05:29:47.982666 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-97bdbd495-2prdt" podUID="2d6e5156-6e75-4dff-a322-b3d43e596c7e" containerName="keystone-api" containerID="cri-o://2031f531f783ff9fda1aa19098c42b8b6619a54760d8a1056a1788a1c38b669b" gracePeriod=30 Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:47.996828 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.004490 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6xxt5"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.016486 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data" (OuterVolumeSpecName: "config-data") pod "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" (UID: "bb6268b1-9fae-42a1-9f9e-0bed9c69cadd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.048183 4931 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openstack_openstack-cell1-galera-0_7e9a7f86-7e9d-4062-9c50-72d0d82e24ef/galera/0.log" to get inode usage: stat /var/log/pods/openstack_openstack-cell1-galera-0_7e9a7f86-7e9d-4062-9c50-72d0d82e24ef/galera/0.log: no such file or directory Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.056609 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.056751 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qrsv\" (UniqueName: \"kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.056845 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.057219 4931 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.057267 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts 
podName:f0ad84e9-a4cc-40a3-850c-f7757aad5b5d nodeName:}" failed. No retries permitted until 2026-01-30 05:29:48.557254062 +0000 UTC m=+1323.927164319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts") pod "keystone-595b-account-create-update-jk6fx" (UID: "f0ad84e9-a4cc-40a3-850c-f7757aad5b5d") : configmap "openstack-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.060908 4931 projected.go:194] Error preparing data for projected volume kube-api-access-9qrsv for pod openstack/keystone-595b-account-create-update-jk6fx: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.060983 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv podName:f0ad84e9-a4cc-40a3-850c-f7757aad5b5d nodeName:}" failed. No retries permitted until 2026-01-30 05:29:48.560956827 +0000 UTC m=+1323.930867084 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9qrsv" (UniqueName: "kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv") pod "keystone-595b-account-create-update-jk6fx" (UID: "f0ad84e9-a4cc-40a3-850c-f7757aad5b5d") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.127193 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9bbdw"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.129197 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9bbdw"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.143590 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-595b-account-create-update-jk6fx"] Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.144150 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-9qrsv operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-595b-account-create-update-jk6fx" podUID="f0ad84e9-a4cc-40a3-850c-f7757aad5b5d" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.171665 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-n52qj" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.172157 4931 generic.go:334] "Generic (PLEG): container finished" podID="623f3c8f-d741-4ba4-baca-905a13102f38" containerID="7543927ad63a3b9e73b08c0c9fa7a83b683adacd56d1d27f7d72109cd07d12dd" exitCode=1 Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.172210 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6xxt5" event={"ID":"623f3c8f-d741-4ba4-baca-905a13102f38","Type":"ContainerDied","Data":"7543927ad63a3b9e73b08c0c9fa7a83b683adacd56d1d27f7d72109cd07d12dd"} Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.172241 4931 scope.go:117] "RemoveContainer" containerID="3a989cbeb7f4bf86d12831f4d3313ed2a342cf72d86cb0362de32fea4fb7324a" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.172616 4931 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-6xxt5" secret="" err="secret \"galera-openstack-dockercfg-ms6mr\" not found" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.172651 4931 scope.go:117] "RemoveContainer" containerID="7543927ad63a3b9e73b08c0c9fa7a83b683adacd56d1d27f7d72109cd07d12dd" Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.172911 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-6xxt5_openstack(623f3c8f-d741-4ba4-baca-905a13102f38)\"" pod="openstack/root-account-create-update-6xxt5" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.198484 4931 generic.go:334] "Generic (PLEG): container finished" podID="30f9b591-fea6-4010-99db-45eef2237cdc" 
containerID="100081f00d3d095ca7d8dca6b7343ac8590f3de539067c314527dbcd86ceca1c" exitCode=2 Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.198741 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30f9b591-fea6-4010-99db-45eef2237cdc","Type":"ContainerDied","Data":"100081f00d3d095ca7d8dca6b7343ac8590f3de539067c314527dbcd86ceca1c"} Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.219721 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bb44c01-e79f-42d8-912c-66db07c6b328","Type":"ContainerDied","Data":"1c009cefddaacdb91464295ae716d32cc8f92887e0517a9011cacb821ea578bd"} Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.219823 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.234596 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": dial tcp 10.217.0.206:3000: connect: connection refused" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.234721 4931 generic.go:334] "Generic (PLEG): container finished" podID="58928fea-709c-44d8-bd12-23937da8e2c4" containerID="44392d9ac535d9a3ce2ca47aa88e680823c3197a2e50d537aa67df4b03e52fd1" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.234804 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d69b6c966-npv8t" event={"ID":"58928fea-709c-44d8-bd12-23937da8e2c4","Type":"ContainerDied","Data":"44392d9ac535d9a3ce2ca47aa88e680823c3197a2e50d537aa67df4b03e52fd1"} Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.239767 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-798b7dc5fb-xl2zq" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.241629 4931 generic.go:334] "Generic (PLEG): container finished" podID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerID="d70e15b0f074e59f1a9f39048c2cf45a62e81400091cb70df139030d514fe003" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.241671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"406c25f3-c398-4ace-ba4b-1d9b48b289a2","Type":"ContainerDied","Data":"d70e15b0f074e59f1a9f39048c2cf45a62e81400091cb70df139030d514fe003"} Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.254349 4931 scope.go:117] "RemoveContainer" containerID="9aac5eb9a7735bf1efc6d134170c4297691e32c19fa1a2cd01ab0ae918243436" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.264140 4931 generic.go:334] "Generic (PLEG): container finished" podID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerID="6c90254ae67ae50ab19fa555ce55d1839d94322882a09bb91ad616b62efcfeeb" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.264198 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74","Type":"ContainerDied","Data":"6c90254ae67ae50ab19fa555ce55d1839d94322882a09bb91ad616b62efcfeeb"} Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.269730 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c9gv\" (UniqueName: \"kubernetes.io/projected/46ad7de9-e01d-414c-8a4d-9073ad986186-kube-api-access-5c9gv\") pod \"46ad7de9-e01d-414c-8a4d-9073ad986186\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.269939 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ad7de9-e01d-414c-8a4d-9073ad986186-operator-scripts\") pod 
\"46ad7de9-e01d-414c-8a4d-9073ad986186\" (UID: \"46ad7de9-e01d-414c-8a4d-9073ad986186\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.270788 4931 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.270828 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts podName:623f3c8f-d741-4ba4-baca-905a13102f38 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:48.770814263 +0000 UTC m=+1324.140724520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts") pod "root-account-create-update-6xxt5" (UID: "623f3c8f-d741-4ba4-baca-905a13102f38") : configmap "openstack-scripts" not found
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.271023 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46ad7de9-e01d-414c-8a4d-9073ad986186-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46ad7de9-e01d-414c-8a4d-9073ad986186" (UID: "46ad7de9-e01d-414c-8a4d-9073ad986186"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.271263 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cb5c-account-create-update-n52qj" event={"ID":"46ad7de9-e01d-414c-8a4d-9073ad986186","Type":"ContainerDied","Data":"d1886781d10ecb06348f9abb9caccf69a1bc5841a942465cb5e9e6aa23d451dd"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.271330 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cb5c-account-create-update-n52qj"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.271576 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.286617 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ad7de9-e01d-414c-8a4d-9073ad986186-kube-api-access-5c9gv" (OuterVolumeSpecName: "kube-api-access-5c9gv") pod "46ad7de9-e01d-414c-8a4d-9073ad986186" (UID: "46ad7de9-e01d-414c-8a4d-9073ad986186"). InnerVolumeSpecName "kube-api-access-5c9gv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.292171 4931 generic.go:334] "Generic (PLEG): container finished" podID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerID="25cc12087ab98d0fc79e679c4de5be61f557329f293a5a68393ba8b20a57c428" exitCode=0
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.292206 4931 generic.go:334] "Generic (PLEG): container finished" podID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerID="0f6848e1ccd25c33da13cfce62f451555f794b35623ff124d320281a39cb9911" exitCode=2
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.292254 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerDied","Data":"25cc12087ab98d0fc79e679c4de5be61f557329f293a5a68393ba8b20a57c428"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.292279 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerDied","Data":"0f6848e1ccd25c33da13cfce62f451555f794b35623ff124d320281a39cb9911"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.299006 4931 generic.go:334] "Generic (PLEG): container finished" podID="7c0ddaec-4521-4898-8649-262b52f24acb" containerID="3509d69982e816f9732671b256bbf363b32c9a199362011499a8607bf3a6e808" exitCode=0
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.299070 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c0ddaec-4521-4898-8649-262b52f24acb","Type":"ContainerDied","Data":"3509d69982e816f9732671b256bbf363b32c9a199362011499a8607bf3a6e808"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.299215 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.307141 4931 generic.go:334] "Generic (PLEG): container finished" podID="3415cfc4-a71a-4110-bf82-295181bb386f" containerID="cc448c5e4a9d4def969b75156b3cc39bbccbe47f49a05ef9d15592b4643a809f" exitCode=0
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.307275 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3415cfc4-a71a-4110-bf82-295181bb386f","Type":"ContainerDied","Data":"cc448c5e4a9d4def969b75156b3cc39bbccbe47f49a05ef9d15592b4643a809f"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.311820 4931 generic.go:334] "Generic (PLEG): container finished" podID="9ac55021-a07e-443f-9ee9-e7516556b975" containerID="1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d" exitCode=143
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.311897 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" event={"ID":"9ac55021-a07e-443f-9ee9-e7516556b975","Type":"ContainerDied","Data":"1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.315262 4931 generic.go:334] "Generic (PLEG): container finished" podID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerID="eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123" exitCode=143
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.315331 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7789bbd757-45b5w" event={"ID":"c0c7aeee-9023-433a-83d0-aa0e9942a0ed","Type":"ContainerDied","Data":"eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.321661 4931 generic.go:334] "Generic (PLEG): container finished" podID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerID="e8519c60ec437acc9c9b5934ab3951ad5ad349186eda26fc85c2bae9b3010203" exitCode=0
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.321726 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-798b7dc5fb-xl2zq" event={"ID":"ebe4f743-9a60-428f-8b58-14ba160d9fd7","Type":"ContainerDied","Data":"e8519c60ec437acc9c9b5934ab3951ad5ad349186eda26fc85c2bae9b3010203"}
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.321801 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-798b7dc5fb-xl2zq"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.328737 4931 scope.go:117] "RemoveContainer" containerID="3509d69982e816f9732671b256bbf363b32c9a199362011499a8607bf3a6e808"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.328873 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.328945 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76fb878d5c-s22sw"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.329003 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-df05-account-create-update-nrbm4"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.329061 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.329086 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.329170 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.342958 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.359952 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.368183 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371703 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-public-tls-certs\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371757 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-logs\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371798 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-scripts\") pod \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371830 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-config-data\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371889 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-combined-ca-bundle\") pod \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371933 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-internal-tls-certs\") pod \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371954 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.371982 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-public-tls-certs\") pod \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372006 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vcv\" (UniqueName: \"kubernetes.io/projected/ebe4f743-9a60-428f-8b58-14ba160d9fd7-kube-api-access-68vcv\") pod \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372031 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmpkl\" (UniqueName: \"kubernetes.io/projected/7c0ddaec-4521-4898-8649-262b52f24acb-kube-api-access-pmpkl\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372062 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-config-data\") pod \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372082 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-httpd-run\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372116 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-combined-ca-bundle\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372146 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-scripts\") pod \"7c0ddaec-4521-4898-8649-262b52f24acb\" (UID: \"7c0ddaec-4521-4898-8649-262b52f24acb\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372175 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe4f743-9a60-428f-8b58-14ba160d9fd7-logs\") pod \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\" (UID: \"ebe4f743-9a60-428f-8b58-14ba160d9fd7\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.372628 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ad7de9-e01d-414c-8a4d-9073ad986186-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.373034 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c9gv\" (UniqueName: \"kubernetes.io/projected/46ad7de9-e01d-414c-8a4d-9073ad986186-kube-api-access-5c9gv\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.373087 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-logs" (OuterVolumeSpecName: "logs") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.373645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe4f743-9a60-428f-8b58-14ba160d9fd7-logs" (OuterVolumeSpecName: "logs") pod "ebe4f743-9a60-428f-8b58-14ba160d9fd7" (UID: "ebe4f743-9a60-428f-8b58-14ba160d9fd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.373914 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.376983 4931 scope.go:117] "RemoveContainer" containerID="754804bc268dc311547eddc996a035b132392b26798898d2ba034bc32dc1ee16"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.377267 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0ddaec-4521-4898-8649-262b52f24acb-kube-api-access-pmpkl" (OuterVolumeSpecName: "kube-api-access-pmpkl") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "kube-api-access-pmpkl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.377597 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-scripts" (OuterVolumeSpecName: "scripts") pod "ebe4f743-9a60-428f-8b58-14ba160d9fd7" (UID: "ebe4f743-9a60-428f-8b58-14ba160d9fd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.380019 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-scripts" (OuterVolumeSpecName: "scripts") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.393886 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.394734 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe4f743-9a60-428f-8b58-14ba160d9fd7-kube-api-access-68vcv" (OuterVolumeSpecName: "kube-api-access-68vcv") pod "ebe4f743-9a60-428f-8b58-14ba160d9fd7" (UID: "ebe4f743-9a60-428f-8b58-14ba160d9fd7"). InnerVolumeSpecName "kube-api-access-68vcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.417481 4931 scope.go:117] "RemoveContainer" containerID="e8519c60ec437acc9c9b5934ab3951ad5ad349186eda26fc85c2bae9b3010203"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.430037 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-76fb878d5c-s22sw"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.440337 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-76fb878d5c-s22sw"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.446290 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.447556 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerName="galera" containerID="cri-o://2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257" gracePeriod=30
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.477941 4931 scope.go:117] "RemoveContainer" containerID="1d733edb3ceaca43f34355e23bbaaced9e55a731057ead7b89c96398337d6e11"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.479507 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.481010 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-combined-ca-bundle\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.481898 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.481944 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-config-data\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.481977 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-internal-tls-certs\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482015 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnhps\" (UniqueName: \"kubernetes.io/projected/3415cfc4-a71a-4110-bf82-295181bb386f-kube-api-access-nnhps\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482125 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-httpd-run\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482166 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-scripts\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482217 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-logs\") pod \"3415cfc4-a71a-4110-bf82-295181bb386f\" (UID: \"3415cfc4-a71a-4110-bf82-295181bb386f\") "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482664 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482680 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe4f743-9a60-428f-8b58-14ba160d9fd7-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482689 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482701 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482708 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482726 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482735 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vcv\" (UniqueName: \"kubernetes.io/projected/ebe4f743-9a60-428f-8b58-14ba160d9fd7-kube-api-access-68vcv\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482744 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmpkl\" (UniqueName: \"kubernetes.io/projected/7c0ddaec-4521-4898-8649-262b52f24acb-kube-api-access-pmpkl\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.482753 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c0ddaec-4521-4898-8649-262b52f24acb-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.485004 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.485586 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-logs" (OuterVolumeSpecName: "logs") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.486612 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.494021 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3415cfc4-a71a-4110-bf82-295181bb386f-kube-api-access-nnhps" (OuterVolumeSpecName: "kube-api-access-nnhps") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "kube-api-access-nnhps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.498444 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.503954 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.506264 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.507864 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-scripts" (OuterVolumeSpecName: "scripts") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.518325 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.522794 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.524053 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe4f743-9a60-428f-8b58-14ba160d9fd7" (UID: "ebe4f743-9a60-428f-8b58-14ba160d9fd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.527274 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-config-data" (OuterVolumeSpecName: "config-data") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.550562 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-df05-account-create-update-nrbm4"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.559387 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.559793 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c0ddaec-4521-4898-8649-262b52f24acb" (UID: "7c0ddaec-4521-4898-8649-262b52f24acb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.563067 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-df05-account-create-update-nrbm4"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.568919 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.574785 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.583756 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qrsv\" (UniqueName: \"kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.583909 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx"
Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.584138 4931 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.584304 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts podName:f0ad84e9-a4cc-40a3-850c-f7757aad5b5d nodeName:}" failed. No retries permitted until 2026-01-30 05:29:49.584288389 +0000 UTC m=+1324.954198646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts") pod "keystone-595b-account-create-update-jk6fx" (UID: "f0ad84e9-a4cc-40a3-850c-f7757aad5b5d") : configmap "openstack-scripts" not found
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584392 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584479 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnhps\" (UniqueName: \"kubernetes.io/projected/3415cfc4-a71a-4110-bf82-295181bb386f-kube-api-access-nnhps\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584533 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584581 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584628 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584676 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584730 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584778 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3415cfc4-a71a-4110-bf82-295181bb386f-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.584929 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0ddaec-4521-4898-8649-262b52f24acb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.601127 4931 projected.go:194] Error preparing data for projected volume kube-api-access-9qrsv for pod openstack/keystone-595b-account-create-update-jk6fx: failed to fetch token: serviceaccounts "galera-openstack" not found
Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.601201 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv podName:f0ad84e9-a4cc-40a3-850c-f7757aad5b5d nodeName:}" failed. No retries permitted until 2026-01-30 05:29:49.601182937 +0000 UTC m=+1324.971093194 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9qrsv" (UniqueName: "kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv") pod "keystone-595b-account-create-update-jk6fx" (UID: "f0ad84e9-a4cc-40a3-850c-f7757aad5b5d") : failed to fetch token: serviceaccounts "galera-openstack" not found
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.606289 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.609889 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.612938 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.637883 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ebe4f743-9a60-428f-8b58-14ba160d9fd7" (UID: "ebe4f743-9a60-428f-8b58-14ba160d9fd7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.637916 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-config-data" (OuterVolumeSpecName: "config-data") pod "3415cfc4-a71a-4110-bf82-295181bb386f" (UID: "3415cfc4-a71a-4110-bf82-295181bb386f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.637931 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-config-data" (OuterVolumeSpecName: "config-data") pod "ebe4f743-9a60-428f-8b58-14ba160d9fd7" (UID: "ebe4f743-9a60-428f-8b58-14ba160d9fd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.664318 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ebe4f743-9a60-428f-8b58-14ba160d9fd7" (UID: "ebe4f743-9a60-428f-8b58-14ba160d9fd7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.686515 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.686547 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.686560 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.686570 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.686578 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.686586 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4f743-9a60-428f-8b58-14ba160d9fd7-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.686594 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3415cfc4-a71a-4110-bf82-295181bb386f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.733082 4931 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.742156 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.765291 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cb5c-account-create-update-n52qj"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.772694 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.780063 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.784477 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cb5c-account-create-update-n52qj"] Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.788353 4931 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.788403 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts podName:623f3c8f-d741-4ba4-baca-905a13102f38 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:49.788390642 +0000 UTC m=+1325.158300899 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts") pod "root-account-create-update-6xxt5" (UID: "623f3c8f-d741-4ba4-baca-905a13102f38") : configmap "openstack-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.825454 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.850220 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.888996 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-nova-metadata-tls-certs\") pod \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889037 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-public-tls-certs\") pod \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889067 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dkxm\" (UniqueName: \"kubernetes.io/projected/406c25f3-c398-4ace-ba4b-1d9b48b289a2-kube-api-access-8dkxm\") pod \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889088 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data\") pod \"58928fea-709c-44d8-bd12-23937da8e2c4\" 
(UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889103 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znwrn\" (UniqueName: \"kubernetes.io/projected/30f9b591-fea6-4010-99db-45eef2237cdc-kube-api-access-znwrn\") pod \"30f9b591-fea6-4010-99db-45eef2237cdc\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889128 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-combined-ca-bundle\") pod \"58928fea-709c-44d8-bd12-23937da8e2c4\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889154 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-internal-tls-certs\") pod \"58928fea-709c-44d8-bd12-23937da8e2c4\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889184 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data-custom\") pod \"58928fea-709c-44d8-bd12-23937da8e2c4\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-certs\") pod \"30f9b591-fea6-4010-99db-45eef2237cdc\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889227 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-internal-tls-certs\") pod \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889253 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-public-tls-certs\") pod \"58928fea-709c-44d8-bd12-23937da8e2c4\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889274 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x99zt\" (UniqueName: \"kubernetes.io/projected/58928fea-709c-44d8-bd12-23937da8e2c4-kube-api-access-x99zt\") pod \"58928fea-709c-44d8-bd12-23937da8e2c4\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889299 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-combined-ca-bundle\") pod \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889322 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-logs\") pod \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889340 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-config-data\") pod \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\" (UID: 
\"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889355 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-combined-ca-bundle\") pod \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889373 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-config-data\") pod \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889392 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-config\") pod \"30f9b591-fea6-4010-99db-45eef2237cdc\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889471 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58928fea-709c-44d8-bd12-23937da8e2c4-logs\") pod \"58928fea-709c-44d8-bd12-23937da8e2c4\" (UID: \"58928fea-709c-44d8-bd12-23937da8e2c4\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889513 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/406c25f3-c398-4ace-ba4b-1d9b48b289a2-logs\") pod \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\" (UID: \"406c25f3-c398-4ace-ba4b-1d9b48b289a2\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889550 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc8vz\" (UniqueName: 
\"kubernetes.io/projected/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-kube-api-access-pc8vz\") pod \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\" (UID: \"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.889572 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-combined-ca-bundle\") pod \"30f9b591-fea6-4010-99db-45eef2237cdc\" (UID: \"30f9b591-fea6-4010-99db-45eef2237cdc\") " Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.894594 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58928fea-709c-44d8-bd12-23937da8e2c4-logs" (OuterVolumeSpecName: "logs") pod "58928fea-709c-44d8-bd12-23937da8e2c4" (UID: "58928fea-709c-44d8-bd12-23937da8e2c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.895693 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58928fea-709c-44d8-bd12-23937da8e2c4-kube-api-access-x99zt" (OuterVolumeSpecName: "kube-api-access-x99zt") pod "58928fea-709c-44d8-bd12-23937da8e2c4" (UID: "58928fea-709c-44d8-bd12-23937da8e2c4"). InnerVolumeSpecName "kube-api-access-x99zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.896578 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f9b591-fea6-4010-99db-45eef2237cdc-kube-api-access-znwrn" (OuterVolumeSpecName: "kube-api-access-znwrn") pod "30f9b591-fea6-4010-99db-45eef2237cdc" (UID: "30f9b591-fea6-4010-99db-45eef2237cdc"). InnerVolumeSpecName "kube-api-access-znwrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.896898 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-logs" (OuterVolumeSpecName: "logs") pod "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" (UID: "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.907126 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/406c25f3-c398-4ace-ba4b-1d9b48b289a2-logs" (OuterVolumeSpecName: "logs") pod "406c25f3-c398-4ace-ba4b-1d9b48b289a2" (UID: "406c25f3-c398-4ace-ba4b-1d9b48b289a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.914625 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406c25f3-c398-4ace-ba4b-1d9b48b289a2-kube-api-access-8dkxm" (OuterVolumeSpecName: "kube-api-access-8dkxm") pod "406c25f3-c398-4ace-ba4b-1d9b48b289a2" (UID: "406c25f3-c398-4ace-ba4b-1d9b48b289a2"). InnerVolumeSpecName "kube-api-access-8dkxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.919341 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58928fea-709c-44d8-bd12-23937da8e2c4" (UID: "58928fea-709c-44d8-bd12-23937da8e2c4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.924773 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30f9b591-fea6-4010-99db-45eef2237cdc" (UID: "30f9b591-fea6-4010-99db-45eef2237cdc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.948216 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-kube-api-access-pc8vz" (OuterVolumeSpecName: "kube-api-access-pc8vz") pod "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" (UID: "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74"). InnerVolumeSpecName "kube-api-access-pc8vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.977987 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "30f9b591-fea6-4010-99db-45eef2237cdc" (UID: "30f9b591-fea6-4010-99db-45eef2237cdc"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.980482 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-798b7dc5fb-xl2zq"] Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.986977 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "406c25f3-c398-4ace-ba4b-1d9b48b289a2" (UID: "406c25f3-c398-4ace-ba4b-1d9b48b289a2"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.991275 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991294 4931 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: E0130 05:29:48.991332 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data podName:fc3f4796-66b1-452b-afca-5e62cbf2a53b nodeName:}" failed. No retries permitted until 2026-01-30 05:29:56.991314311 +0000 UTC m=+1332.361224568 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data") pod "rabbitmq-server-0" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b") : configmap "rabbitmq-config-data" not found Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991357 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58928fea-709c-44d8-bd12-23937da8e2c4-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991370 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/406c25f3-c398-4ace-ba4b-1d9b48b289a2-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991382 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc8vz\" (UniqueName: \"kubernetes.io/projected/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-kube-api-access-pc8vz\") on node \"crc\" DevicePath \"\"" Jan 30 
05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991397 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991406 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dkxm\" (UniqueName: \"kubernetes.io/projected/406c25f3-c398-4ace-ba4b-1d9b48b289a2-kube-api-access-8dkxm\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991416 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znwrn\" (UniqueName: \"kubernetes.io/projected/30f9b591-fea6-4010-99db-45eef2237cdc-kube-api-access-znwrn\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991437 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991447 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x99zt\" (UniqueName: \"kubernetes.io/projected/58928fea-709c-44d8-bd12-23937da8e2c4-kube-api-access-x99zt\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991455 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4931]: I0130 05:29:48.991463 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.000887 4931 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-config-data" (OuterVolumeSpecName: "config-data") pod "406c25f3-c398-4ace-ba4b-1d9b48b289a2" (UID: "406c25f3-c398-4ace-ba4b-1d9b48b289a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.003302 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-798b7dc5fb-xl2zq"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.025002 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-config-data" (OuterVolumeSpecName: "config-data") pod "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" (UID: "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.032550 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58928fea-709c-44d8-bd12-23937da8e2c4" (UID: "58928fea-709c-44d8-bd12-23937da8e2c4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.032697 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "406c25f3-c398-4ace-ba4b-1d9b48b289a2" (UID: "406c25f3-c398-4ace-ba4b-1d9b48b289a2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.035856 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data" (OuterVolumeSpecName: "config-data") pod "58928fea-709c-44d8-bd12-23937da8e2c4" (UID: "58928fea-709c-44d8-bd12-23937da8e2c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.047681 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58928fea-709c-44d8-bd12-23937da8e2c4" (UID: "58928fea-709c-44d8-bd12-23937da8e2c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.051669 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "30f9b591-fea6-4010-99db-45eef2237cdc" (UID: "30f9b591-fea6-4010-99db-45eef2237cdc"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.051794 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" (UID: "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.053179 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" (UID: "e4e6d6a8-599b-4ab9-b1f7-cf521e455d74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.073986 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "406c25f3-c398-4ace-ba4b-1d9b48b289a2" (UID: "406c25f3-c398-4ace-ba4b-1d9b48b289a2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.078994 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "58928fea-709c-44d8-bd12-23937da8e2c4" (UID: "58928fea-709c-44d8-bd12-23937da8e2c4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093372 4931 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093400 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093411 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093422 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093441 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093450 4931 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f9b591-fea6-4010-99db-45eef2237cdc-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093459 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc 
kubenswrapper[4931]: I0130 05:29:49.093467 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58928fea-709c-44d8-bd12-23937da8e2c4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093475 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406c25f3-c398-4ace-ba4b-1d9b48b289a2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093483 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.093491 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.212950 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.277258 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.278808 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.279800 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.279835 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1acfa9c2-a802-404e-976b-93d9f99e1fbb" containerName="nova-scheduler-scheduler" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.297086 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-config-data\") pod \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " Jan 30 05:29:49 crc kubenswrapper[4931]: 
I0130 05:29:49.297172 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l65cq\" (UniqueName: \"kubernetes.io/projected/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kube-api-access-l65cq\") pod \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.297205 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-memcached-tls-certs\") pod \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.297259 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kolla-config\") pod \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.297321 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-combined-ca-bundle\") pod \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\" (UID: \"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba\") " Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.298071 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" (UID: "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.298169 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-config-data" (OuterVolumeSpecName: "config-data") pod "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" (UID: "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.298610 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.298629 4931 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.302562 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kube-api-access-l65cq" (OuterVolumeSpecName: "kube-api-access-l65cq") pod "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" (UID: "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba"). InnerVolumeSpecName "kube-api-access-l65cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.331943 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" (UID: "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.350181 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.350395 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.351407 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.351465 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.352720 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3415cfc4-a71a-4110-bf82-295181bb386f","Type":"ContainerDied","Data":"8686488d53f891915ba13840ec460659816d6140e0778cc81ec5034b3206cf0a"} Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.352760 4931 scope.go:117] "RemoveContainer" containerID="cc448c5e4a9d4def969b75156b3cc39bbccbe47f49a05ef9d15592b4643a809f" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.352930 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.360934 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" (UID: "3bc265a8-34e2-4ec9-bdd5-69d75ea14bba"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.363400 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.365060 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.368338 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.368388 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.373865 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4e6d6a8-599b-4ab9-b1f7-cf521e455d74","Type":"ContainerDied","Data":"575ed258be47595d56c46486254ee62d83c72e80bf57019828419331f46802a7"} Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.374093 
4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.382997 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d69b6c966-npv8t" event={"ID":"58928fea-709c-44d8-bd12-23937da8e2c4","Type":"ContainerDied","Data":"80b7562d2e28920f91efc6005a3dada9547915b81544850dc4480d2be479f27a"} Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.383190 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d69b6c966-npv8t" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.405111 4931 generic.go:334] "Generic (PLEG): container finished" podID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerID="9186f065f875b33ba8c3817977c815fff7b67dc7404ddb06ea6a85fb64800755" exitCode=0 Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.405358 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerDied","Data":"9186f065f875b33ba8c3817977c815fff7b67dc7404ddb06ea6a85fb64800755"} Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.407334 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.407944 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l65cq\" (UniqueName: \"kubernetes.io/projected/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-kube-api-access-l65cq\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.408013 4931 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" 
Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.409374 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.413816 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"406c25f3-c398-4ace-ba4b-1d9b48b289a2","Type":"ContainerDied","Data":"76659dd7c99e1a96db2d103669e2a9a9122278f8d46deab3719b00840b99f159"} Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.413973 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.423707 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.429316 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.439128 4931 generic.go:334] "Generic (PLEG): container finished" podID="3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" containerID="d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a" exitCode=0 Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.439352 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.439624 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.455972 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.456198 4931 scope.go:117] "RemoveContainer" containerID="3795752efe01b170e153d47107ec186f09220104cdec5c71d0a210a63580f858" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.459681 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d338366-1ff1-4c95-aa94-30ba5c813138" path="/var/lib/kubelet/pods/0d338366-1ff1-4c95-aa94-30ba5c813138/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.460350 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2400d2d7-1da5-4a38-a558-c970226f95b9" path="/var/lib/kubelet/pods/2400d2d7-1da5-4a38-a558-c970226f95b9/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.460901 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2565fa42-f180-4948-8b2f-68c419d78d2b" path="/var/lib/kubelet/pods/2565fa42-f180-4948-8b2f-68c419d78d2b/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.465670 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" path="/var/lib/kubelet/pods/3415cfc4-a71a-4110-bf82-295181bb386f/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.466859 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ad7de9-e01d-414c-8a4d-9073ad986186" path="/var/lib/kubelet/pods/46ad7de9-e01d-414c-8a4d-9073ad986186/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.467545 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a63fb4-24bc-4834-b6e7-937688c5de09" path="/var/lib/kubelet/pods/49a63fb4-24bc-4834-b6e7-937688c5de09/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.468969 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" 
path="/var/lib/kubelet/pods/7c0ddaec-4521-4898-8649-262b52f24acb/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.469856 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e9a7f86-7e9d-4062-9c50-72d0d82e24ef" path="/var/lib/kubelet/pods/7e9a7f86-7e9d-4062-9c50-72d0d82e24ef/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.470991 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef60747-e73b-451c-b8e1-6abd596d31bb" path="/var/lib/kubelet/pods/7ef60747-e73b-451c-b8e1-6abd596d31bb/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.471867 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88988b92-cd64-490d-b55f-959ecf4095af" path="/var/lib/kubelet/pods/88988b92-cd64-490d-b55f-959ecf4095af/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.472455 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d21216-5a0f-422c-9642-0ea353a33e82" path="/var/lib/kubelet/pods/98d21216-5a0f-422c-9642-0ea353a33e82/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.473199 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98fff7bd-db4c-462f-8f2c-34733f4e81ad" path="/var/lib/kubelet/pods/98fff7bd-db4c-462f-8f2c-34733f4e81ad/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.473623 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb44c01-e79f-42d8-912c-66db07c6b328" path="/var/lib/kubelet/pods/9bb44c01-e79f-42d8-912c-66db07c6b328/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.474561 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d923658-472c-4565-bae3-5eb1e329a92c" path="/var/lib/kubelet/pods/9d923658-472c-4565-bae3-5eb1e329a92c/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.475060 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6268b1-9fae-42a1-9f9e-0bed9c69cadd" 
path="/var/lib/kubelet/pods/bb6268b1-9fae-42a1-9f9e-0bed9c69cadd/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.475756 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5722020-7619-4a17-8990-e025402e2c3a" path="/var/lib/kubelet/pods/c5722020-7619-4a17-8990-e025402e2c3a/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.476967 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e" path="/var/lib/kubelet/pods/c86b96a8-cd5c-4ea7-8a6f-5b3a4b2d923e/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.477581 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13136a7-4633-4386-822d-ceb2cb3320b8" path="/var/lib/kubelet/pods/d13136a7-4633-4386-822d-ceb2cb3320b8/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.477943 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" path="/var/lib/kubelet/pods/ebe4f743-9a60-428f-8b58-14ba160d9fd7/volumes" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.478969 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.478995 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30f9b591-fea6-4010-99db-45eef2237cdc","Type":"ContainerDied","Data":"3ab5021fa2dee4a0cbf054b6b79552974b77b39e6c35cbc24e07bc801848b48b"} Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.479016 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.479032 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba","Type":"ContainerDied","Data":"d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a"} Jan 30 05:29:49 crc 
kubenswrapper[4931]: I0130 05:29:49.479079 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3bc265a8-34e2-4ec9-bdd5-69d75ea14bba","Type":"ContainerDied","Data":"04861bcc57b9390c9ad1874bbf632a1a5e0da259d664ad8c22e1c2db45c343a6"} Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.479093 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d69b6c966-npv8t"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.479105 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7d69b6c966-npv8t"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.497739 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.525346 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.533507 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.541956 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.584198 4931 scope.go:117] "RemoveContainer" containerID="6c90254ae67ae50ab19fa555ce55d1839d94322882a09bb91ad616b62efcfeeb" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.590264 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.606395 4931 scope.go:117] "RemoveContainer" containerID="a268ff4ead170d4fc7c25a89e846ed2d0f10278b94da4082529cc4ebd9ab4f0e" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.615568 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.618156 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9qrsv\" (UniqueName: \"kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.618244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts\") pod \"keystone-595b-account-create-update-jk6fx\" (UID: \"f0ad84e9-a4cc-40a3-850c-f7757aad5b5d\") " pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.618407 4931 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.618471 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts podName:f0ad84e9-a4cc-40a3-850c-f7757aad5b5d nodeName:}" failed. No retries permitted until 2026-01-30 05:29:51.61845634 +0000 UTC m=+1326.988366597 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts") pod "keystone-595b-account-create-update-jk6fx" (UID: "f0ad84e9-a4cc-40a3-850c-f7757aad5b5d") : configmap "openstack-scripts" not found Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.621012 4931 projected.go:194] Error preparing data for projected volume kube-api-access-9qrsv for pod openstack/keystone-595b-account-create-update-jk6fx: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.621213 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv podName:f0ad84e9-a4cc-40a3-850c-f7757aad5b5d nodeName:}" failed. No retries permitted until 2026-01-30 05:29:51.621049263 +0000 UTC m=+1326.990959520 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9qrsv" (UniqueName: "kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv") pod "keystone-595b-account-create-update-jk6fx" (UID: "f0ad84e9-a4cc-40a3-850c-f7757aad5b5d") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.625619 4931 scope.go:117] "RemoveContainer" containerID="44392d9ac535d9a3ce2ca47aa88e680823c3197a2e50d537aa67df4b03e52fd1" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.651888 4931 scope.go:117] "RemoveContainer" containerID="0e0a199cc977b5213010336cc2b6c461a3916b61b6c3d9f6dc8eecc7d5c8d17e" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.686760 4931 scope.go:117] "RemoveContainer" containerID="d70e15b0f074e59f1a9f39048c2cf45a62e81400091cb70df139030d514fe003" Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.727413 4931 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap 
"rabbitmq-cell1-config-data" not found Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.727600 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data podName:081e3873-ea99-4486-925f-784a98e49405 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:57.727579126 +0000 UTC m=+1333.097489463 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data") pod "rabbitmq-cell1-server-0" (UID: "081e3873-ea99-4486-925f-784a98e49405") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.742675 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.759619 4931 scope.go:117] "RemoveContainer" containerID="e5dadd497214a5d2efc5b8027947f3661f7f73599b0778570358c42329955e8d" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.798084 4931 scope.go:117] "RemoveContainer" containerID="100081f00d3d095ca7d8dca6b7343ac8590f3de539067c314527dbcd86ceca1c" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.825256 4931 scope.go:117] "RemoveContainer" containerID="d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a" Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.829035 4931 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.829115 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts podName:623f3c8f-d741-4ba4-baca-905a13102f38 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:51.829100648 +0000 UTC m=+1327.199010905 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts") pod "root-account-create-update-6xxt5" (UID: "623f3c8f-d741-4ba4-baca-905a13102f38") : configmap "openstack-scripts" not found Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.851552 4931 scope.go:117] "RemoveContainer" containerID="d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a" Jan 30 05:29:49 crc kubenswrapper[4931]: E0130 05:29:49.853186 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a\": container with ID starting with d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a not found: ID does not exist" containerID="d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.853221 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a"} err="failed to get container status \"d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a\": rpc error: code = NotFound desc = could not find container \"d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a\": container with ID starting with d18c2a61b748bbaa1434cb9b18af9aeefe6182613328656e2c5f81364416b28a not found: ID does not exist" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.930690 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq7wh\" (UniqueName: \"kubernetes.io/projected/623f3c8f-d741-4ba4-baca-905a13102f38-kube-api-access-pq7wh\") pod \"623f3c8f-d741-4ba4-baca-905a13102f38\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.930818 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts\") pod \"623f3c8f-d741-4ba4-baca-905a13102f38\" (UID: \"623f3c8f-d741-4ba4-baca-905a13102f38\") " Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.931310 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "623f3c8f-d741-4ba4-baca-905a13102f38" (UID: "623f3c8f-d741-4ba4-baca-905a13102f38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.931809 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623f3c8f-d741-4ba4-baca-905a13102f38-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4931]: I0130 05:29:49.941878 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623f3c8f-d741-4ba4-baca-905a13102f38-kube-api-access-pq7wh" (OuterVolumeSpecName: "kube-api-access-pq7wh") pod "623f3c8f-d741-4ba4-baca-905a13102f38" (UID: "623f3c8f-d741-4ba4-baca-905a13102f38"). InnerVolumeSpecName "kube-api-access-pq7wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.034325 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq7wh\" (UniqueName: \"kubernetes.io/projected/623f3c8f-d741-4ba4-baca-905a13102f38-kube-api-access-pq7wh\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.107973 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ggjtl" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" probeResult="failure" output="command timed out" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.140216 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ggjtl" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" probeResult="failure" output=< Jan 30 05:29:50 crc kubenswrapper[4931]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Jan 30 05:29:50 crc kubenswrapper[4931]: > Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.333229 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:29:50 crc kubenswrapper[4931]: E0130 05:29:50.411768 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949 is running failed: container process not found" containerID="cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 30 05:29:50 crc kubenswrapper[4931]: E0130 05:29:50.412175 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949 is running failed: container process not found" containerID="cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 30 05:29:50 crc kubenswrapper[4931]: E0130 05:29:50.412360 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949 is running failed: container process not found" containerID="cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 30 05:29:50 crc kubenswrapper[4931]: E0130 05:29:50.412392 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="ovn-northd" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.430692 4931 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="081e3873-ea99-4486-925f-784a98e49405" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.442811 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-default\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.442909 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5fq4\" (UniqueName: \"kubernetes.io/projected/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kube-api-access-t5fq4\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.442968 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-galera-tls-certs\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.443034 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kolla-config\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.443125 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-operator-scripts\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: 
\"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.443168 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.443202 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-generated\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.443232 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-combined-ca-bundle\") pod \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\" (UID: \"348ffd7a-9b7f-40aa-ada9-145a3a783d09\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.443842 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.443718 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.444306 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.444468 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.457590 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_586d7a1d-7b2a-45ac-aacb-b77e95bf3d91/ovn-northd/0.log" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.457678 4931 generic.go:334] "Generic (PLEG): container finished" podID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerID="cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949" exitCode=139 Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.457789 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91","Type":"ContainerDied","Data":"cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949"} Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.457820 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91","Type":"ContainerDied","Data":"14085ad5b1fb30e2e472a98f1ef3cb304ab6fa42857d4a5f5e235f581937f71b"} Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.457832 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14085ad5b1fb30e2e472a98f1ef3cb304ab6fa42857d4a5f5e235f581937f71b" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.460037 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.463467 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.463460 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fc3f4796-66b1-452b-afca-5e62cbf2a53b","Type":"ContainerDied","Data":"1c82fc5914a60be58942659c6c59b346ba961ba1c401d7f0c82d22447fc0b135"} Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.464081 4931 scope.go:117] "RemoveContainer" containerID="1c82fc5914a60be58942659c6c59b346ba961ba1c401d7f0c82d22447fc0b135" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.463308 4931 generic.go:334] "Generic (PLEG): container finished" podID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerID="1c82fc5914a60be58942659c6c59b346ba961ba1c401d7f0c82d22447fc0b135" exitCode=0 Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.467404 4931 generic.go:334] "Generic (PLEG): container finished" podID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerID="2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257" exitCode=0 Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.467502 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"348ffd7a-9b7f-40aa-ada9-145a3a783d09","Type":"ContainerDied","Data":"2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257"} Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.467530 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"348ffd7a-9b7f-40aa-ada9-145a3a783d09","Type":"ContainerDied","Data":"b84c7628d09612f5f198418a62d4a2daabe598f826560ac2746867af05368a8f"} Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.467658 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.470146 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.482538 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_586d7a1d-7b2a-45ac-aacb-b77e95bf3d91/ovn-northd/0.log" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.482645 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.482933 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-595b-account-create-update-jk6fx" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.483953 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6xxt5" event={"ID":"623f3c8f-d741-4ba4-baca-905a13102f38","Type":"ContainerDied","Data":"b0a95f43d76f1ec6c04259c654f8d4d57485ed8bd7f8f8efedcad5a660e7a5b0"} Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.484267 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6xxt5" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.499571 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kube-api-access-t5fq4" (OuterVolumeSpecName: "kube-api-access-t5fq4") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "kube-api-access-t5fq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.545883 4931 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.545932 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.545958 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.545970 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.551645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.552479 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/348ffd7a-9b7f-40aa-ada9-145a3a783d09-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.552502 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5fq4\" (UniqueName: \"kubernetes.io/projected/348ffd7a-9b7f-40aa-ada9-145a3a783d09-kube-api-access-t5fq4\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.563336 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.571788 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "348ffd7a-9b7f-40aa-ada9-145a3a783d09" (UID: "348ffd7a-9b7f-40aa-ada9-145a3a783d09"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.604878 4931 scope.go:117] "RemoveContainer" containerID="8bdcfbd624616a917de046867a4b176539e978b80dd1b9fad737dfdab9cb1bce" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.648861 4931 scope.go:117] "RemoveContainer" containerID="2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.649062 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-595b-account-create-update-jk6fx"] Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.652961 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-plugins-conf\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653010 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-northd-tls-certs\") pod \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653043 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-plugins\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653064 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc3f4796-66b1-452b-afca-5e62cbf2a53b-erlang-cookie-secret\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: 
\"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653092 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653108 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-metrics-certs-tls-certs\") pod \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653133 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-scripts\") pod \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653189 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb59q\" (UniqueName: \"kubernetes.io/projected/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-kube-api-access-qb59q\") pod \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653221 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653241 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-combined-ca-bundle\") pod \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653266 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc3f4796-66b1-452b-afca-5e62cbf2a53b-pod-info\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653282 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-confd\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653302 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6xv5\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-kube-api-access-d6xv5\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653319 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-config\") pod \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653346 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-server-conf\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653374 4931 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-erlang-cookie\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653388 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-rundir\") pod \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\" (UID: \"586d7a1d-7b2a-45ac-aacb-b77e95bf3d91\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653402 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-tls\") pod \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\" (UID: \"fc3f4796-66b1-452b-afca-5e62cbf2a53b\") " Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.653537 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.655114 4931 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.655138 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.655151 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.655166 4931 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/348ffd7a-9b7f-40aa-ada9-145a3a783d09-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.655190 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-595b-account-create-update-jk6fx"] Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.655924 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-scripts" (OuterVolumeSpecName: "scripts") pod "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" (UID: "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.662689 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.662741 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-config" (OuterVolumeSpecName: "config") pod "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" (UID: "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.662975 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" (UID: "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.663219 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.665021 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.666387 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.666481 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-kube-api-access-d6xv5" (OuterVolumeSpecName: "kube-api-access-d6xv5") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "kube-api-access-d6xv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.668538 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fc3f4796-66b1-452b-afca-5e62cbf2a53b-pod-info" (OuterVolumeSpecName: "pod-info") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.672582 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-kube-api-access-qb59q" (OuterVolumeSpecName: "kube-api-access-qb59q") pod "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" (UID: "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"). InnerVolumeSpecName "kube-api-access-qb59q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.678089 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6xxt5"] Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.678799 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data" (OuterVolumeSpecName: "config-data") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.679568 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3f4796-66b1-452b-afca-5e62cbf2a53b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.681867 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" (UID: "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.683579 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6xxt5"] Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.707385 4931 scope.go:117] "RemoveContainer" containerID="8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.711820 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-server-conf" (OuterVolumeSpecName: "server-conf") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.727837 4931 scope.go:117] "RemoveContainer" containerID="2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257" Jan 30 05:29:50 crc kubenswrapper[4931]: E0130 05:29:50.728138 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257\": container with ID starting with 2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257 not found: ID does not exist" containerID="2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257" Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.728201 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257"} err="failed to get container status \"2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257\": rpc error: code = NotFound desc = could not find container \"2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257\": container with ID starting with 
2be8fb84347f40a619544d41d9e655f731841a9706fc18fb88a00804b3fe0257 not found: ID does not exist"
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.728224 4931 scope.go:117] "RemoveContainer" containerID="8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07"
Jan 30 05:29:50 crc kubenswrapper[4931]: E0130 05:29:50.728492 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07\": container with ID starting with 8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07 not found: ID does not exist" containerID="8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07"
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.728533 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07"} err="failed to get container status \"8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07\": rpc error: code = NotFound desc = could not find container \"8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07\": container with ID starting with 8c5e06a4cbe821677c0328bfab275af633aec8c57e858d92fdfeb861d12c3e07 not found: ID does not exist"
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.728600 4931 scope.go:117] "RemoveContainer" containerID="7543927ad63a3b9e73b08c0c9fa7a83b683adacd56d1d27f7d72109cd07d12dd"
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.731639 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" (UID: "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.736650 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" (UID: "586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.749055 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fc3f4796-66b1-452b-afca-5e62cbf2a53b" (UID: "fc3f4796-66b1-452b-afca-5e62cbf2a53b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756619 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756643 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756652 4931 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fc3f4796-66b1-452b-afca-5e62cbf2a53b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756697 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756714 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756725 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756733 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756765 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb59q\" (UniqueName: \"kubernetes.io/projected/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-kube-api-access-qb59q\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756774 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qrsv\" (UniqueName: \"kubernetes.io/projected/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d-kube-api-access-9qrsv\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756783 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756792 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756799 4931 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fc3f4796-66b1-452b-afca-5e62cbf2a53b-pod-info\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756807 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756816 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6xv5\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-kube-api-access-d6xv5\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756842 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756850 4931 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fc3f4796-66b1-452b-afca-5e62cbf2a53b-server-conf\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756859 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756868 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91-ovn-rundir\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.756876 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fc3f4796-66b1-452b-afca-5e62cbf2a53b-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.788084 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.799129 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.806005 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.823372 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.829272 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 30 05:29:50 crc kubenswrapper[4931]: I0130 05:29:50.858024 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.225451 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-97bdbd495-2prdt" podUID="2d6e5156-6e75-4dff-a322-b3d43e596c7e" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.155:5000/v3\": read tcp 10.217.0.2:50830->10.217.0.155:5000: read: connection reset by peer"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.432239 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f9b591-fea6-4010-99db-45eef2237cdc" path="/var/lib/kubelet/pods/30f9b591-fea6-4010-99db-45eef2237cdc/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.433172 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" path="/var/lib/kubelet/pods/348ffd7a-9b7f-40aa-ada9-145a3a783d09/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.433935 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" path="/var/lib/kubelet/pods/3bc265a8-34e2-4ec9-bdd5-69d75ea14bba/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.435105 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" path="/var/lib/kubelet/pods/406c25f3-c398-4ace-ba4b-1d9b48b289a2/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.435794 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" path="/var/lib/kubelet/pods/58928fea-709c-44d8-bd12-23937da8e2c4/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.436340 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" path="/var/lib/kubelet/pods/623f3c8f-d741-4ba4-baca-905a13102f38/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.437586 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" path="/var/lib/kubelet/pods/e4e6d6a8-599b-4ab9-b1f7-cf521e455d74/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.438034 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ad84e9-a4cc-40a3-850c-f7757aad5b5d" path="/var/lib/kubelet/pods/f0ad84e9-a4cc-40a3-850c-f7757aad5b5d/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.438618 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" path="/var/lib/kubelet/pods/fc3f4796-66b1-452b-afca-5e62cbf2a53b/volumes"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.501118 4931 generic.go:334] "Generic (PLEG): container finished" podID="2d6e5156-6e75-4dff-a322-b3d43e596c7e" containerID="2031f531f783ff9fda1aa19098c42b8b6619a54760d8a1056a1788a1c38b669b" exitCode=0
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.501543 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97bdbd495-2prdt" event={"ID":"2d6e5156-6e75-4dff-a322-b3d43e596c7e","Type":"ContainerDied","Data":"2031f531f783ff9fda1aa19098c42b8b6619a54760d8a1056a1788a1c38b669b"}
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.504976 4931 generic.go:334] "Generic (PLEG): container finished" podID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerID="62da5f526098b3b9f5437a81119156f87878963a1c44c026236d9b63e20bbac5" exitCode=0
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.505013 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerDied","Data":"62da5f526098b3b9f5437a81119156f87878963a1c44c026236d9b63e20bbac5"}
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.505028 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cacfcbd5-8c12-4fc5-88ce-516fda23464d","Type":"ContainerDied","Data":"99a1153c1cd92ab2a34d0651a54dd16cc1116a03a4d5c96b1f4e7e5abbde1e2d"}
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.505038 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a1153c1cd92ab2a34d0651a54dd16cc1116a03a4d5c96b1f4e7e5abbde1e2d"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.506409 4931 generic.go:334] "Generic (PLEG): container finished" podID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerID="2c58ff417f0ff91cddebc47f633febd5ac50ac3ddd97dcc2fce28574c94ac8a6" exitCode=0
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.506468 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c996f77-c9rqm"
event={"ID":"7729e2d8-6c8c-4759-9e5d-535ad1586f47","Type":"ContainerDied","Data":"2c58ff417f0ff91cddebc47f633febd5ac50ac3ddd97dcc2fce28574c94ac8a6"}
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.508911 4931 generic.go:334] "Generic (PLEG): container finished" podID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerID="1bd0c14353cbfd196f658cae7f7167624a1cc818a0ca23ec5151f1c871a22e65" exitCode=0
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.508950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" event={"ID":"728a2e60-915e-4447-9465-aa64f7f5c7bb","Type":"ContainerDied","Data":"1bd0c14353cbfd196f658cae7f7167624a1cc818a0ca23ec5151f1c871a22e65"}
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.510471 4931 generic.go:334] "Generic (PLEG): container finished" podID="081e3873-ea99-4486-925f-784a98e49405" containerID="1bb7e19530d33f5a4cf134ca5c6644743c868cc750cac0c1bb313f0f47240dd8" exitCode=0
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.510528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"081e3873-ea99-4486-925f-784a98e49405","Type":"ContainerDied","Data":"1bb7e19530d33f5a4cf134ca5c6644743c868cc750cac0c1bb313f0f47240dd8"}
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.512374 4931 generic.go:334] "Generic (PLEG): container finished" podID="1acfa9c2-a802-404e-976b-93d9f99e1fbb" containerID="cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52" exitCode=0
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.512482 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.514187 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1acfa9c2-a802-404e-976b-93d9f99e1fbb","Type":"ContainerDied","Data":"cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52"}
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.545519 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.553833 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.559106 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 05:29:51 crc kubenswrapper[4931]: E0130 05:29:51.572889 4931 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Jan 30 05:29:51 crc kubenswrapper[4931]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-01-30T05:29:44Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Jan 30 05:29:51 crc kubenswrapper[4931]: /etc/init.d/functions: line 589: 393 Alarm clock "$@"
Jan 30 05:29:51 crc kubenswrapper[4931]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-ggjtl" message=<
Jan 30 05:29:51 crc kubenswrapper[4931]: Exiting ovn-controller (1) [FAILED]
Jan 30 05:29:51 crc kubenswrapper[4931]: Killing ovn-controller (1) [ OK ]
Jan 30 05:29:51 crc kubenswrapper[4931]: 2026-01-30T05:29:44Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Jan 30 05:29:51 crc kubenswrapper[4931]: /etc/init.d/functions: line 589: 393 Alarm clock "$@"
Jan 30 05:29:51 crc kubenswrapper[4931]: >
Jan 30 05:29:51 crc kubenswrapper[4931]: E0130 05:29:51.572937 4931 kuberuntime_container.go:691] "PreStop hook failed" err=<
Jan 30 05:29:51 crc kubenswrapper[4931]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-01-30T05:29:44Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Jan 30 05:29:51 crc kubenswrapper[4931]: /etc/init.d/functions: line 589: 393 Alarm clock "$@"
Jan 30 05:29:51 crc kubenswrapper[4931]: > pod="openstack/ovn-controller-ggjtl" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" containerID="cri-o://324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.572998 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ggjtl" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" containerID="cri-o://324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9" gracePeriod=22
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.677741 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-run-httpd\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.677841 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4df4\" (UniqueName: \"kubernetes.io/projected/cacfcbd5-8c12-4fc5-88ce-516fda23464d-kube-api-access-q4df4\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.677899 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-combined-ca-bundle\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.677924 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-scripts\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.677943 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-ceilometer-tls-certs\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.678011 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-config-data\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.678039 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-sg-core-conf-yaml\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.678071 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-log-httpd\") pod \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\" (UID: \"cacfcbd5-8c12-4fc5-88ce-516fda23464d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.678797 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.679049 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.684962 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-scripts" (OuterVolumeSpecName: "scripts") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.685017 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cacfcbd5-8c12-4fc5-88ce-516fda23464d-kube-api-access-q4df4" (OuterVolumeSpecName: "kube-api-access-q4df4") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "kube-api-access-q4df4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.728968 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.749578 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.751653 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.769518 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c996f77-c9rqm"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.779905 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.781275 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4df4\" (UniqueName: \"kubernetes.io/projected/cacfcbd5-8c12-4fc5-88ce-516fda23464d-kube-api-access-q4df4\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.781301 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.781310 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.781320 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.781328 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.781336 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.781344 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cacfcbd5-8c12-4fc5-88ce-516fda23464d-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]:
I0130 05:29:51.786641 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.790645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-config-data" (OuterVolumeSpecName: "config-data") pod "cacfcbd5-8c12-4fc5-88ce-516fda23464d" (UID: "cacfcbd5-8c12-4fc5-88ce-516fda23464d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.798133 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-97bdbd495-2prdt"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.830502 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.882999 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data-custom\") pod \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883096 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r29mt\" (UniqueName: \"kubernetes.io/projected/728a2e60-915e-4447-9465-aa64f7f5c7bb-kube-api-access-r29mt\") pod \"728a2e60-915e-4447-9465-aa64f7f5c7bb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883122 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data\") pod \"728a2e60-915e-4447-9465-aa64f7f5c7bb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883154 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data-custom\") pod \"728a2e60-915e-4447-9465-aa64f7f5c7bb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883214 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-combined-ca-bundle\") pod \"728a2e60-915e-4447-9465-aa64f7f5c7bb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883291 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7729e2d8-6c8c-4759-9e5d-535ad1586f47-logs\") pod \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883332 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpv7q\" (UniqueName: \"kubernetes.io/projected/7729e2d8-6c8c-4759-9e5d-535ad1586f47-kube-api-access-vpv7q\") pod \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883386 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-combined-ca-bundle\") pod \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883407 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data\") pod \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\" (UID: \"7729e2d8-6c8c-4759-9e5d-535ad1586f47\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883457 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2e60-915e-4447-9465-aa64f7f5c7bb-logs\") pod \"728a2e60-915e-4447-9465-aa64f7f5c7bb\" (UID: \"728a2e60-915e-4447-9465-aa64f7f5c7bb\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883838 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cacfcbd5-8c12-4fc5-88ce-516fda23464d-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.883851 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7729e2d8-6c8c-4759-9e5d-535ad1586f47-logs" (OuterVolumeSpecName: "logs") pod "7729e2d8-6c8c-4759-9e5d-535ad1586f47" (UID: "7729e2d8-6c8c-4759-9e5d-535ad1586f47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.884337 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728a2e60-915e-4447-9465-aa64f7f5c7bb-logs" (OuterVolumeSpecName: "logs") pod "728a2e60-915e-4447-9465-aa64f7f5c7bb" (UID: "728a2e60-915e-4447-9465-aa64f7f5c7bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.887386 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728a2e60-915e-4447-9465-aa64f7f5c7bb-kube-api-access-r29mt" (OuterVolumeSpecName: "kube-api-access-r29mt") pod "728a2e60-915e-4447-9465-aa64f7f5c7bb" (UID: "728a2e60-915e-4447-9465-aa64f7f5c7bb"). InnerVolumeSpecName "kube-api-access-r29mt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.888338 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7729e2d8-6c8c-4759-9e5d-535ad1586f47" (UID: "7729e2d8-6c8c-4759-9e5d-535ad1586f47"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.893596 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7729e2d8-6c8c-4759-9e5d-535ad1586f47-kube-api-access-vpv7q" (OuterVolumeSpecName: "kube-api-access-vpv7q") pod "7729e2d8-6c8c-4759-9e5d-535ad1586f47" (UID: "7729e2d8-6c8c-4759-9e5d-535ad1586f47"). InnerVolumeSpecName "kube-api-access-vpv7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.895506 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "728a2e60-915e-4447-9465-aa64f7f5c7bb" (UID: "728a2e60-915e-4447-9465-aa64f7f5c7bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.907340 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7729e2d8-6c8c-4759-9e5d-535ad1586f47" (UID: "7729e2d8-6c8c-4759-9e5d-535ad1586f47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.924595 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728a2e60-915e-4447-9465-aa64f7f5c7bb" (UID: "728a2e60-915e-4447-9465-aa64f7f5c7bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.931673 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data" (OuterVolumeSpecName: "config-data") pod "7729e2d8-6c8c-4759-9e5d-535ad1586f47" (UID: "7729e2d8-6c8c-4759-9e5d-535ad1586f47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.933610 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ggjtl_8a337463-8b7e-496b-9a01-fc491120c21d/ovn-controller/0.log"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.933697 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ggjtl"
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.942999 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data" (OuterVolumeSpecName: "config-data") pod "728a2e60-915e-4447-9465-aa64f7f5c7bb" (UID: "728a2e60-915e-4447-9465-aa64f7f5c7bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.985325 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.985538 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-internal-tls-certs\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.985667 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrm76\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-kube-api-access-rrm76\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.985741 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-config-data\") pod \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.985820 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-server-conf\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.985894 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/081e3873-ea99-4486-925f-784a98e49405-erlang-cookie-secret\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.985969 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/081e3873-ea99-4486-925f-784a98e49405-pod-info\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986054 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-ovn-controller-tls-certs\") pod \"8a337463-8b7e-496b-9a01-fc491120c21d\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986148 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-config-data\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-plugins-conf\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") "
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986269 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx75c\" (UniqueName: \"kubernetes.io/projected/2d6e5156-6e75-4dff-a322-b3d43e596c7e-kube-api-access-hx75c\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") "
Jan 30 05:29:51 crc
kubenswrapper[4931]: I0130 05:29:51.986328 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-scripts\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986392 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-fernet-keys\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986481 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-public-tls-certs\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986556 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run\") pod \"8a337463-8b7e-496b-9a01-fc491120c21d\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986632 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-erlang-cookie\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986702 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-tls\") pod 
\"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986770 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5r9p\" (UniqueName: \"kubernetes.io/projected/1acfa9c2-a802-404e-976b-93d9f99e1fbb-kube-api-access-z5r9p\") pod \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986834 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a337463-8b7e-496b-9a01-fc491120c21d-scripts\") pod \"8a337463-8b7e-496b-9a01-fc491120c21d\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986893 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-combined-ca-bundle\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.986982 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-combined-ca-bundle\") pod \"8a337463-8b7e-496b-9a01-fc491120c21d\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.987058 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7snh\" (UniqueName: \"kubernetes.io/projected/8a337463-8b7e-496b-9a01-fc491120c21d-kube-api-access-l7snh\") pod \"8a337463-8b7e-496b-9a01-fc491120c21d\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.987145 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-credential-keys\") pod \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\" (UID: \"2d6e5156-6e75-4dff-a322-b3d43e596c7e\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.987608 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-plugins\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.987709 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-confd\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.987775 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-combined-ca-bundle\") pod \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\" (UID: \"1acfa9c2-a802-404e-976b-93d9f99e1fbb\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.987839 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-log-ovn\") pod \"8a337463-8b7e-496b-9a01-fc491120c21d\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.987913 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"081e3873-ea99-4486-925f-784a98e49405\" (UID: \"081e3873-ea99-4486-925f-784a98e49405\") " 
Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988302 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r29mt\" (UniqueName: \"kubernetes.io/projected/728a2e60-915e-4447-9465-aa64f7f5c7bb-kube-api-access-r29mt\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988365 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988459 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988544 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2e60-915e-4447-9465-aa64f7f5c7bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988612 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7729e2d8-6c8c-4759-9e5d-535ad1586f47-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988742 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpv7q\" (UniqueName: \"kubernetes.io/projected/7729e2d8-6c8c-4759-9e5d-535ad1586f47-kube-api-access-vpv7q\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988815 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988867 4931 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988921 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2e60-915e-4447-9465-aa64f7f5c7bb-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.988973 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7729e2d8-6c8c-4759-9e5d-535ad1586f47-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.989779 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.990967 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.993142 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6e5156-6e75-4dff-a322-b3d43e596c7e-kube-api-access-hx75c" (OuterVolumeSpecName: "kube-api-access-hx75c") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "kube-api-access-hx75c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.993267 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8a337463-8b7e-496b-9a01-fc491120c21d" (UID: "8a337463-8b7e-496b-9a01-fc491120c21d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.994623 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run" (OuterVolumeSpecName: "var-run") pod "8a337463-8b7e-496b-9a01-fc491120c21d" (UID: "8a337463-8b7e-496b-9a01-fc491120c21d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.994908 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081e3873-ea99-4486-925f-784a98e49405-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.995903 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.996053 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a337463-8b7e-496b-9a01-fc491120c21d-scripts" (OuterVolumeSpecName: "scripts") pod "8a337463-8b7e-496b-9a01-fc491120c21d" (UID: "8a337463-8b7e-496b-9a01-fc491120c21d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.998248 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.998275 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a337463-8b7e-496b-9a01-fc491120c21d-kube-api-access-l7snh" (OuterVolumeSpecName: "kube-api-access-l7snh") pod "8a337463-8b7e-496b-9a01-fc491120c21d" (UID: "8a337463-8b7e-496b-9a01-fc491120c21d"). InnerVolumeSpecName "kube-api-access-l7snh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.998525 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-kube-api-access-rrm76" (OuterVolumeSpecName: "kube-api-access-rrm76") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "kube-api-access-rrm76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.998720 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.999051 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-scripts" (OuterVolumeSpecName: "scripts") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4931]: I0130 05:29:51.999128 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.000379 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acfa9c2-a802-404e-976b-93d9f99e1fbb-kube-api-access-z5r9p" (OuterVolumeSpecName: "kube-api-access-z5r9p") pod "1acfa9c2-a802-404e-976b-93d9f99e1fbb" (UID: "1acfa9c2-a802-404e-976b-93d9f99e1fbb"). InnerVolumeSpecName "kube-api-access-z5r9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.004921 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.008672 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/081e3873-ea99-4486-925f-784a98e49405-pod-info" (OuterVolumeSpecName: "pod-info") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.024243 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data" (OuterVolumeSpecName: "config-data") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.048570 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-config-data" (OuterVolumeSpecName: "config-data") pod "1acfa9c2-a802-404e-976b-93d9f99e1fbb" (UID: "1acfa9c2-a802-404e-976b-93d9f99e1fbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.054722 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.057078 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1acfa9c2-a802-404e-976b-93d9f99e1fbb" (UID: "1acfa9c2-a802-404e-976b-93d9f99e1fbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.060480 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-server-conf" (OuterVolumeSpecName: "server-conf") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.060872 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-config-data" (OuterVolumeSpecName: "config-data") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.062655 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a337463-8b7e-496b-9a01-fc491120c21d" (UID: "8a337463-8b7e-496b-9a01-fc491120c21d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.074091 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.077590 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2d6e5156-6e75-4dff-a322-b3d43e596c7e" (UID: "2d6e5156-6e75-4dff-a322-b3d43e596c7e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.083981 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "8a337463-8b7e-496b-9a01-fc491120c21d" (UID: "8a337463-8b7e-496b-9a01-fc491120c21d"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089288 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run-ovn\") pod \"8a337463-8b7e-496b-9a01-fc491120c21d\" (UID: \"8a337463-8b7e-496b-9a01-fc491120c21d\") " Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089387 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8a337463-8b7e-496b-9a01-fc491120c21d" (UID: "8a337463-8b7e-496b-9a01-fc491120c21d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089587 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089611 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089625 4931 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089637 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx75c\" (UniqueName: \"kubernetes.io/projected/2d6e5156-6e75-4dff-a322-b3d43e596c7e-kube-api-access-hx75c\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089649 4931 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089659 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089669 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089681 4931 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089694 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089703 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089713 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a337463-8b7e-496b-9a01-fc491120c21d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089723 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5r9p\" (UniqueName: 
\"kubernetes.io/projected/1acfa9c2-a802-404e-976b-93d9f99e1fbb-kube-api-access-z5r9p\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089734 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089745 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a337463-8b7e-496b-9a01-fc491120c21d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089755 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7snh\" (UniqueName: \"kubernetes.io/projected/8a337463-8b7e-496b-9a01-fc491120c21d-kube-api-access-l7snh\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089766 4931 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089776 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089786 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089796 4931 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-log-ovn\") on node 
\"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089828 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089840 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089850 4931 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a337463-8b7e-496b-9a01-fc491120c21d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089860 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6e5156-6e75-4dff-a322-b3d43e596c7e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089871 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrm76\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-kube-api-access-rrm76\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089882 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acfa9c2-a802-404e-976b-93d9f99e1fbb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089893 4931 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/081e3873-ea99-4486-925f-784a98e49405-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089903 4931 reconciler_common.go:293] "Volume detached for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/081e3873-ea99-4486-925f-784a98e49405-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.089913 4931 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/081e3873-ea99-4486-925f-784a98e49405-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.103566 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.107340 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "081e3873-ea99-4486-925f-784a98e49405" (UID: "081e3873-ea99-4486-925f-784a98e49405"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.191643 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/081e3873-ea99-4486-925f-784a98e49405-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.191687 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.525628 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ggjtl_8a337463-8b7e-496b-9a01-fc491120c21d/ovn-controller/0.log" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.526817 4931 generic.go:334] "Generic (PLEG): container finished" podID="8a337463-8b7e-496b-9a01-fc491120c21d" containerID="324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9" exitCode=139 Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.526925 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ggjtl" event={"ID":"8a337463-8b7e-496b-9a01-fc491120c21d","Type":"ContainerDied","Data":"324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9"} Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.527513 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ggjtl" event={"ID":"8a337463-8b7e-496b-9a01-fc491120c21d","Type":"ContainerDied","Data":"c2331a0e3efc476073fa6f72907e46cdc0fd3358dd0c363648234586881ae09d"} Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.526955 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ggjtl" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.527597 4931 scope.go:117] "RemoveContainer" containerID="324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.533115 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c996f77-c9rqm" event={"ID":"7729e2d8-6c8c-4759-9e5d-535ad1586f47","Type":"ContainerDied","Data":"ac16bb78f3ca5ff67b0b11f3773806ecd75b6cd0b938e4013f99b8b4e7b2e044"} Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.533246 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c996f77-c9rqm" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.550516 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" event={"ID":"728a2e60-915e-4447-9465-aa64f7f5c7bb","Type":"ContainerDied","Data":"cd9a53b66398f13fcc5edf6801d39072217390bf6fb5b5264a9e5d24f429383b"} Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.550561 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f5d456c6b-66jxb" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.557449 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97bdbd495-2prdt" event={"ID":"2d6e5156-6e75-4dff-a322-b3d43e596c7e","Type":"ContainerDied","Data":"c2c40320b6d71850a7db7d062b86807a450e4758cd147671abdfe8fd00c2df62"} Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.557470 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-97bdbd495-2prdt" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.564907 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"081e3873-ea99-4486-925f-784a98e49405","Type":"ContainerDied","Data":"9ce481797a1f7988304010979cd564d60b819812a50932395d5f66e51b07187f"} Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.564947 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.571171 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.571163 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1acfa9c2-a802-404e-976b-93d9f99e1fbb","Type":"ContainerDied","Data":"24154bd6bbe2da670ea864204ee97206379a1b7b92792be6f14d33757f908143"} Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.571462 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.595841 4931 scope.go:117] "RemoveContainer" containerID="324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9" Jan 30 05:29:52 crc kubenswrapper[4931]: E0130 05:29:52.598728 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9\": container with ID starting with 324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9 not found: ID does not exist" containerID="324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.598785 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9"} err="failed to get container status \"324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9\": rpc error: code = NotFound desc = could not find container \"324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9\": container with ID starting with 324b877332d5afddf300c75ec0403e36ade2235644b01dec590b1d759e7f75a9 not found: ID does not exist" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.598815 4931 scope.go:117] "RemoveContainer" containerID="2c58ff417f0ff91cddebc47f633febd5ac50ac3ddd97dcc2fce28574c94ac8a6" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.605942 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ggjtl"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.613272 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ggjtl"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.631513 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7c996f77-c9rqm"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 
05:29:52.645260 4931 scope.go:117] "RemoveContainer" containerID="4ec5d987a513f3f04cf30f8d242bd5ee734a2387c5a279b070c18b72f4a56519" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.645589 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7c996f77-c9rqm"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.660574 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5f5d456c6b-66jxb"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.668928 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5f5d456c6b-66jxb"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.674576 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.675248 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.680389 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.681527 4931 scope.go:117] "RemoveContainer" containerID="1bd0c14353cbfd196f658cae7f7167624a1cc818a0ca23ec5151f1c871a22e65" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.685744 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.692522 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-97bdbd495-2prdt"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.696704 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-97bdbd495-2prdt"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.701038 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.705179 4931 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.709058 4931 scope.go:117] "RemoveContainer" containerID="f0f483180c30bc672edc2e00c840d52567eb9b0c61f8c285d3a7c2a185f38020" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.725158 4931 scope.go:117] "RemoveContainer" containerID="2031f531f783ff9fda1aa19098c42b8b6619a54760d8a1056a1788a1c38b669b" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.750086 4931 scope.go:117] "RemoveContainer" containerID="1bb7e19530d33f5a4cf134ca5c6644743c868cc750cac0c1bb313f0f47240dd8" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.770896 4931 scope.go:117] "RemoveContainer" containerID="4db4fc560f1e0be65146b56bdc4340b3ff1c5a4fe7510a353795090f99291213" Jan 30 05:29:52 crc kubenswrapper[4931]: I0130 05:29:52.800741 4931 scope.go:117] "RemoveContainer" containerID="cc83b1f403d157f95969723a88f60d5874181dc078497768c1cec64f4187dd52" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.439153 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="081e3873-ea99-4486-925f-784a98e49405" path="/var/lib/kubelet/pods/081e3873-ea99-4486-925f-784a98e49405/volumes" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.439727 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1acfa9c2-a802-404e-976b-93d9f99e1fbb" path="/var/lib/kubelet/pods/1acfa9c2-a802-404e-976b-93d9f99e1fbb/volumes" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.440170 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6e5156-6e75-4dff-a322-b3d43e596c7e" path="/var/lib/kubelet/pods/2d6e5156-6e75-4dff-a322-b3d43e596c7e/volumes" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.441144 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" path="/var/lib/kubelet/pods/586d7a1d-7b2a-45ac-aacb-b77e95bf3d91/volumes" Jan 30 05:29:53 crc 
kubenswrapper[4931]: I0130 05:29:53.441761 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" path="/var/lib/kubelet/pods/728a2e60-915e-4447-9465-aa64f7f5c7bb/volumes" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.442318 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" path="/var/lib/kubelet/pods/7729e2d8-6c8c-4759-9e5d-535ad1586f47/volumes" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.443458 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" path="/var/lib/kubelet/pods/8a337463-8b7e-496b-9a01-fc491120c21d/volumes" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.444127 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" path="/var/lib/kubelet/pods/cacfcbd5-8c12-4fc5-88ce-516fda23464d/volumes" Jan 30 05:29:53 crc kubenswrapper[4931]: I0130 05:29:53.567129 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-75d9f6f6ff-kmswn" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.165:9696/\": dial tcp 10.217.0.165:9696: connect: connection refused" Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.341462 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.342359 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.342835 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.342942 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.344185 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.345755 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.347624 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4931]: E0130 05:29:54.347700 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:29:55 crc kubenswrapper[4931]: I0130 05:29:55.326024 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: i/o timeout" Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.363228 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.363317 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.363385 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.364476 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"083c2726f719c1b6c228fc0d209a309a403985263c1ced3ea0982529442fd973"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.364604 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://083c2726f719c1b6c228fc0d209a309a403985263c1ced3ea0982529442fd973" gracePeriod=600 Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.637987 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="083c2726f719c1b6c228fc0d209a309a403985263c1ced3ea0982529442fd973" exitCode=0 Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.638044 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"083c2726f719c1b6c228fc0d209a309a403985263c1ced3ea0982529442fd973"} Jan 30 05:29:57 crc kubenswrapper[4931]: I0130 05:29:57.638289 4931 scope.go:117] "RemoveContainer" containerID="a45fd242a77041b5be27fe445a509a614e0332f92cf4e23ef129ae6c3582244f" Jan 30 05:29:58 crc kubenswrapper[4931]: I0130 05:29:58.676333 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0"} Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.341204 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.341754 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.342095 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.342158 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" 
containerName="ovsdb-server" Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.343948 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.345707 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.347192 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4931]: E0130 05:29:59.347299 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177249 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv"] Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177717 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acfa9c2-a802-404e-976b-93d9f99e1fbb" 
containerName="nova-scheduler-scheduler" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177738 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acfa9c2-a802-404e-976b-93d9f99e1fbb" containerName="nova-scheduler-scheduler" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177758 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-metadata" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177770 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-metadata" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177795 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" containerName="mariadb-account-create-update" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177807 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" containerName="mariadb-account-create-update" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177824 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="openstack-network-exporter" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177836 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="openstack-network-exporter" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177852 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="sg-core" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177864 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="sg-core" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177887 4931 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177899 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177920 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177934 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177949 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" containerName="memcached" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177961 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" containerName="memcached" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.177978 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.177990 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178002 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="proxy-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178013 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="proxy-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178032 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" 
containerName="barbican-api-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178045 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178069 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerName="galera" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178080 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerName="galera" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178094 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178105 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178123 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178135 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178157 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178169 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178185 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-api" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178196 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-api" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178220 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="ovn-northd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178232 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="ovn-northd" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178244 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178256 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178271 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178283 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178304 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178317 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178338 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-api" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178350 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-api" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178371 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178383 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178401 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081e3873-ea99-4486-925f-784a98e49405" containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178413 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="081e3873-ea99-4486-925f-784a98e49405" containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178454 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-notification-agent" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178467 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-notification-agent" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178491 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178504 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178527 4931 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178539 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178555 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6e5156-6e75-4dff-a322-b3d43e596c7e" containerName="keystone-api" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178568 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6e5156-6e75-4dff-a322-b3d43e596c7e" containerName="keystone-api" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178583 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" containerName="mariadb-account-create-update" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178596 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" containerName="mariadb-account-create-update" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178618 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerName="mysql-bootstrap" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178630 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerName="mysql-bootstrap" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178642 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178653 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178668 4931 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="30f9b591-fea6-4010-99db-45eef2237cdc" containerName="kube-state-metrics" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178680 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f9b591-fea6-4010-99db-45eef2237cdc" containerName="kube-state-metrics" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178700 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081e3873-ea99-4486-925f-784a98e49405" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178714 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="081e3873-ea99-4486-925f-784a98e49405" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178729 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178740 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker-log" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178759 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178771 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker" Jan 30 05:30:00 crc kubenswrapper[4931]: E0130 05:30:00.178787 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-central-agent" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.178799 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-central-agent" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179036 4931 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1acfa9c2-a802-404e-976b-93d9f99e1fbb" containerName="nova-scheduler-scheduler" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179058 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" containerName="mariadb-account-create-update" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179077 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="ovn-northd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179099 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179122 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179143 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="348ffd7a-9b7f-40aa-ada9-145a3a783d09" containerName="galera" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179162 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7729e2d8-6c8c-4759-9e5d-535ad1586f47" containerName="barbican-worker-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179181 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-api" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179203 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="586d7a1d-7b2a-45ac-aacb-b77e95bf3d91" containerName="openstack-network-exporter" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179227 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-notification-agent" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 
05:30:00.179244 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f9b591-fea6-4010-99db-45eef2237cdc" containerName="kube-state-metrics" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179256 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a337463-8b7e-496b-9a01-fc491120c21d" containerName="ovn-controller" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179275 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179297 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="081e3873-ea99-4486-925f-784a98e49405" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179319 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe4f743-9a60-428f-8b58-14ba160d9fd7" containerName="placement-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179340 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179358 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="sg-core" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179377 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="728a2e60-915e-4447-9465-aa64f7f5c7bb" containerName="barbican-keystone-listener-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179391 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc265a8-34e2-4ec9-bdd5-69d75ea14bba" containerName="memcached" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179411 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6e5156-6e75-4dff-a322-b3d43e596c7e" containerName="keystone-api" Jan 30 05:30:00 
crc kubenswrapper[4931]: I0130 05:30:00.179452 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" containerName="nova-metadata-metadata" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179473 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179493 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="58928fea-709c-44d8-bd12-23937da8e2c4" containerName="barbican-api-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179509 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3415cfc4-a71a-4110-bf82-295181bb386f" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179524 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-api" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179538 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3f4796-66b1-452b-afca-5e62cbf2a53b" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179586 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="ceilometer-central-agent" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179606 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cacfcbd5-8c12-4fc5-88ce-516fda23464d" containerName="proxy-httpd" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179626 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="406c25f3-c398-4ace-ba4b-1d9b48b289a2" containerName="nova-api-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179652 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e6d6a8-599b-4ab9-b1f7-cf521e455d74" 
containerName="nova-metadata-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.179677 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0ddaec-4521-4898-8649-262b52f24acb" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.180546 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.185495 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.193787 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.204081 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv"] Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.243374 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkswv\" (UniqueName: \"kubernetes.io/projected/09a19500-eb44-455f-a8b7-7ee5375b87ef-kube-api-access-dkswv\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.243647 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09a19500-eb44-455f-a8b7-7ee5375b87ef-config-volume\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.243791 
4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09a19500-eb44-455f-a8b7-7ee5375b87ef-secret-volume\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.344761 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09a19500-eb44-455f-a8b7-7ee5375b87ef-secret-volume\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.344916 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkswv\" (UniqueName: \"kubernetes.io/projected/09a19500-eb44-455f-a8b7-7ee5375b87ef-kube-api-access-dkswv\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.344946 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09a19500-eb44-455f-a8b7-7ee5375b87ef-config-volume\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.345929 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09a19500-eb44-455f-a8b7-7ee5375b87ef-config-volume\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.351946 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09a19500-eb44-455f-a8b7-7ee5375b87ef-secret-volume\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.368983 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkswv\" (UniqueName: \"kubernetes.io/projected/09a19500-eb44-455f-a8b7-7ee5375b87ef-kube-api-access-dkswv\") pod \"collect-profiles-29495850-8t6xv\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.518284 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:00 crc kubenswrapper[4931]: I0130 05:30:00.821388 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv"] Jan 30 05:30:00 crc kubenswrapper[4931]: W0130 05:30:00.831745 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a19500_eb44_455f_a8b7_7ee5375b87ef.slice/crio-1b851dd5da777878c4407279b086c7ba0baadc7fa13f64016500cc321a47e3f9 WatchSource:0}: Error finding container 1b851dd5da777878c4407279b086c7ba0baadc7fa13f64016500cc321a47e3f9: Status 404 returned error can't find the container with id 1b851dd5da777878c4407279b086c7ba0baadc7fa13f64016500cc321a47e3f9 Jan 30 05:30:01 crc kubenswrapper[4931]: I0130 05:30:01.716844 4931 generic.go:334] "Generic (PLEG): container finished" podID="09a19500-eb44-455f-a8b7-7ee5375b87ef" containerID="f1cc6685442d84c78caf7ee74e69ba6f0a12fa18a641f9f2d8eb2d03f2ae6e04" exitCode=0 Jan 30 05:30:01 crc kubenswrapper[4931]: I0130 05:30:01.716957 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" event={"ID":"09a19500-eb44-455f-a8b7-7ee5375b87ef","Type":"ContainerDied","Data":"f1cc6685442d84c78caf7ee74e69ba6f0a12fa18a641f9f2d8eb2d03f2ae6e04"} Jan 30 05:30:01 crc kubenswrapper[4931]: I0130 05:30:01.717288 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" event={"ID":"09a19500-eb44-455f-a8b7-7ee5375b87ef","Type":"ContainerStarted","Data":"1b851dd5da777878c4407279b086c7ba0baadc7fa13f64016500cc321a47e3f9"} Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.175979 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.193972 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkswv\" (UniqueName: \"kubernetes.io/projected/09a19500-eb44-455f-a8b7-7ee5375b87ef-kube-api-access-dkswv\") pod \"09a19500-eb44-455f-a8b7-7ee5375b87ef\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.194062 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09a19500-eb44-455f-a8b7-7ee5375b87ef-secret-volume\") pod \"09a19500-eb44-455f-a8b7-7ee5375b87ef\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.194099 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09a19500-eb44-455f-a8b7-7ee5375b87ef-config-volume\") pod \"09a19500-eb44-455f-a8b7-7ee5375b87ef\" (UID: \"09a19500-eb44-455f-a8b7-7ee5375b87ef\") " Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.194953 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a19500-eb44-455f-a8b7-7ee5375b87ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "09a19500-eb44-455f-a8b7-7ee5375b87ef" (UID: "09a19500-eb44-455f-a8b7-7ee5375b87ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.210182 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a19500-eb44-455f-a8b7-7ee5375b87ef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "09a19500-eb44-455f-a8b7-7ee5375b87ef" (UID: "09a19500-eb44-455f-a8b7-7ee5375b87ef"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.210874 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a19500-eb44-455f-a8b7-7ee5375b87ef-kube-api-access-dkswv" (OuterVolumeSpecName: "kube-api-access-dkswv") pod "09a19500-eb44-455f-a8b7-7ee5375b87ef" (UID: "09a19500-eb44-455f-a8b7-7ee5375b87ef"). InnerVolumeSpecName "kube-api-access-dkswv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.296128 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkswv\" (UniqueName: \"kubernetes.io/projected/09a19500-eb44-455f-a8b7-7ee5375b87ef-kube-api-access-dkswv\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.296194 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09a19500-eb44-455f-a8b7-7ee5375b87ef-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.296214 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09a19500-eb44-455f-a8b7-7ee5375b87ef-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.744438 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.744472 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv" event={"ID":"09a19500-eb44-455f-a8b7-7ee5375b87ef","Type":"ContainerDied","Data":"1b851dd5da777878c4407279b086c7ba0baadc7fa13f64016500cc321a47e3f9"} Jan 30 05:30:03 crc kubenswrapper[4931]: I0130 05:30:03.744526 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b851dd5da777878c4407279b086c7ba0baadc7fa13f64016500cc321a47e3f9" Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.341806 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.342807 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.343397 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" 
containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.343497 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.345001 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.347497 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.349739 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4931]: E0130 05:30:04.349832 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.649392 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75d9f6f6ff-kmswn" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.668980 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-internal-tls-certs\") pod \"e1f9790c-c395-4c72-b569-3140f703b56f\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.669126 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-httpd-config\") pod \"e1f9790c-c395-4c72-b569-3140f703b56f\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.669173 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-public-tls-certs\") pod \"e1f9790c-c395-4c72-b569-3140f703b56f\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.669250 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-combined-ca-bundle\") pod \"e1f9790c-c395-4c72-b569-3140f703b56f\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.669292 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c78ch\" (UniqueName: 
\"kubernetes.io/projected/e1f9790c-c395-4c72-b569-3140f703b56f-kube-api-access-c78ch\") pod \"e1f9790c-c395-4c72-b569-3140f703b56f\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.669339 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-config\") pod \"e1f9790c-c395-4c72-b569-3140f703b56f\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.669378 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-ovndb-tls-certs\") pod \"e1f9790c-c395-4c72-b569-3140f703b56f\" (UID: \"e1f9790c-c395-4c72-b569-3140f703b56f\") " Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.678533 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e1f9790c-c395-4c72-b569-3140f703b56f" (UID: "e1f9790c-c395-4c72-b569-3140f703b56f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.680480 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f9790c-c395-4c72-b569-3140f703b56f-kube-api-access-c78ch" (OuterVolumeSpecName: "kube-api-access-c78ch") pod "e1f9790c-c395-4c72-b569-3140f703b56f" (UID: "e1f9790c-c395-4c72-b569-3140f703b56f"). InnerVolumeSpecName "kube-api-access-c78ch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.724401 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-config" (OuterVolumeSpecName: "config") pod "e1f9790c-c395-4c72-b569-3140f703b56f" (UID: "e1f9790c-c395-4c72-b569-3140f703b56f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.737078 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e1f9790c-c395-4c72-b569-3140f703b56f" (UID: "e1f9790c-c395-4c72-b569-3140f703b56f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.739211 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1f9790c-c395-4c72-b569-3140f703b56f" (UID: "e1f9790c-c395-4c72-b569-3140f703b56f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.743269 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e1f9790c-c395-4c72-b569-3140f703b56f" (UID: "e1f9790c-c395-4c72-b569-3140f703b56f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.771665 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.771723 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.771743 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.771808 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.771829 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c78ch\" (UniqueName: \"kubernetes.io/projected/e1f9790c-c395-4c72-b569-3140f703b56f-kube-api-access-c78ch\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.771851 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.772397 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e1f9790c-c395-4c72-b569-3140f703b56f" (UID: 
"e1f9790c-c395-4c72-b569-3140f703b56f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.777347 4931 generic.go:334] "Generic (PLEG): container finished" podID="e1f9790c-c395-4c72-b569-3140f703b56f" containerID="59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22" exitCode=0 Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.777454 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75d9f6f6ff-kmswn" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.777538 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75d9f6f6ff-kmswn" event={"ID":"e1f9790c-c395-4c72-b569-3140f703b56f","Type":"ContainerDied","Data":"59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22"} Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.777644 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75d9f6f6ff-kmswn" event={"ID":"e1f9790c-c395-4c72-b569-3140f703b56f","Type":"ContainerDied","Data":"7ca85a404546ccf8741ecd606d0270826591549f7f7979a41ae78e99d2986a63"} Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.777709 4931 scope.go:117] "RemoveContainer" containerID="e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.813788 4931 scope.go:117] "RemoveContainer" containerID="59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.815469 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75d9f6f6ff-kmswn"] Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.820633 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-75d9f6f6ff-kmswn"] Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.835345 4931 scope.go:117] "RemoveContainer" 
containerID="e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e" Jan 30 05:30:05 crc kubenswrapper[4931]: E0130 05:30:05.835813 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e\": container with ID starting with e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e not found: ID does not exist" containerID="e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.835881 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e"} err="failed to get container status \"e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e\": rpc error: code = NotFound desc = could not find container \"e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e\": container with ID starting with e66d09407f90c54520f8315b166cae42a909127ad176413e644d8b4d72518f5e not found: ID does not exist" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.835920 4931 scope.go:117] "RemoveContainer" containerID="59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22" Jan 30 05:30:05 crc kubenswrapper[4931]: E0130 05:30:05.836302 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22\": container with ID starting with 59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22 not found: ID does not exist" containerID="59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.836335 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22"} err="failed to get container status \"59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22\": rpc error: code = NotFound desc = could not find container \"59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22\": container with ID starting with 59911ed8b4d4f6e717a3a6a2f0ab4674f62b93e56bb563bd9ec43bf578791b22 not found: ID does not exist" Jan 30 05:30:05 crc kubenswrapper[4931]: I0130 05:30:05.873739 4931 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f9790c-c395-4c72-b569-3140f703b56f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:07 crc kubenswrapper[4931]: I0130 05:30:07.440389 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" path="/var/lib/kubelet/pods/e1f9790c-c395-4c72-b569-3140f703b56f/volumes" Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.343242 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.344107 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.344946 4931 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.345015 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.346621 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.349403 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.352079 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4931]: E0130 05:30:09.352177 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:30:13 crc kubenswrapper[4931]: E0130 05:30:13.886790 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52577244_c181_4919_b5b0_040e229163db.slice/crio-conmon-7960131bc61ab6450751b905e24e2ccae8d9fe2d400984f5011874b3859c6719.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52577244_c181_4919_b5b0_040e229163db.slice/crio-7960131bc61ab6450751b905e24e2ccae8d9fe2d400984f5011874b3859c6719.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:30:13 crc kubenswrapper[4931]: I0130 05:30:13.895719 4931 generic.go:334] "Generic (PLEG): container finished" podID="52577244-c181-4919-b5b0-040e229163db" containerID="7960131bc61ab6450751b905e24e2ccae8d9fe2d400984f5011874b3859c6719" exitCode=137 Jan 30 05:30:13 crc kubenswrapper[4931]: I0130 05:30:13.895770 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"7960131bc61ab6450751b905e24e2ccae8d9fe2d400984f5011874b3859c6719"} Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.232449 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.313225 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") pod \"52577244-c181-4919-b5b0-040e229163db\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.313316 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"52577244-c181-4919-b5b0-040e229163db\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.313395 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-lock\") pod \"52577244-c181-4919-b5b0-040e229163db\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.313495 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-cache\") pod \"52577244-c181-4919-b5b0-040e229163db\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.313528 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52577244-c181-4919-b5b0-040e229163db-combined-ca-bundle\") pod \"52577244-c181-4919-b5b0-040e229163db\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.313586 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56w5l\" (UniqueName: 
\"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-kube-api-access-56w5l\") pod \"52577244-c181-4919-b5b0-040e229163db\" (UID: \"52577244-c181-4919-b5b0-040e229163db\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.314373 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-lock" (OuterVolumeSpecName: "lock") pod "52577244-c181-4919-b5b0-040e229163db" (UID: "52577244-c181-4919-b5b0-040e229163db"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.314355 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-cache" (OuterVolumeSpecName: "cache") pod "52577244-c181-4919-b5b0-040e229163db" (UID: "52577244-c181-4919-b5b0-040e229163db"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.314720 4931 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-lock\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.331476 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "52577244-c181-4919-b5b0-040e229163db" (UID: "52577244-c181-4919-b5b0-040e229163db"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.331494 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "52577244-c181-4919-b5b0-040e229163db" (UID: "52577244-c181-4919-b5b0-040e229163db"). 
InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.331536 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-kube-api-access-56w5l" (OuterVolumeSpecName: "kube-api-access-56w5l") pod "52577244-c181-4919-b5b0-040e229163db" (UID: "52577244-c181-4919-b5b0-040e229163db"). InnerVolumeSpecName "kube-api-access-56w5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.341841 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.341985 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17 is running failed: container process not found" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.342442 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17 is running failed: container process not found" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 
05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.342532 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.342727 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17 is running failed: container process not found" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.342771 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.343094 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:14 crc kubenswrapper[4931]: E0130 05:30:14.343117 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-thxc2" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.415393 4931 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52577244-c181-4919-b5b0-040e229163db-cache\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.415489 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56w5l\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-kube-api-access-56w5l\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.415505 4931 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52577244-c181-4919-b5b0-040e229163db-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.415587 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.434642 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.517203 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.552018 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-thxc2_5732e34e-6330-4a36-9082-dbb50eede9f2/ovs-vswitchd/0.log" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.554744 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-thxc2" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618250 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-etc-ovs\") pod \"5732e34e-6330-4a36-9082-dbb50eede9f2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618323 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-log\") pod \"5732e34e-6330-4a36-9082-dbb50eede9f2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618336 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "5732e34e-6330-4a36-9082-dbb50eede9f2" (UID: "5732e34e-6330-4a36-9082-dbb50eede9f2"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618371 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-lib\") pod \"5732e34e-6330-4a36-9082-dbb50eede9f2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618415 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-lib" (OuterVolumeSpecName: "var-lib") pod "5732e34e-6330-4a36-9082-dbb50eede9f2" (UID: "5732e34e-6330-4a36-9082-dbb50eede9f2"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618479 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-log" (OuterVolumeSpecName: "var-log") pod "5732e34e-6330-4a36-9082-dbb50eede9f2" (UID: "5732e34e-6330-4a36-9082-dbb50eede9f2"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618570 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5732e34e-6330-4a36-9082-dbb50eede9f2-scripts\") pod \"5732e34e-6330-4a36-9082-dbb50eede9f2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618628 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-run\") pod \"5732e34e-6330-4a36-9082-dbb50eede9f2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618660 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d259g\" (UniqueName: \"kubernetes.io/projected/5732e34e-6330-4a36-9082-dbb50eede9f2-kube-api-access-d259g\") pod \"5732e34e-6330-4a36-9082-dbb50eede9f2\" (UID: \"5732e34e-6330-4a36-9082-dbb50eede9f2\") " Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.618733 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-run" (OuterVolumeSpecName: "var-run") pod "5732e34e-6330-4a36-9082-dbb50eede9f2" (UID: "5732e34e-6330-4a36-9082-dbb50eede9f2"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.619090 4931 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.619149 4931 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-log\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.619161 4931 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-lib\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.619202 4931 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5732e34e-6330-4a36-9082-dbb50eede9f2-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.620696 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5732e34e-6330-4a36-9082-dbb50eede9f2-scripts" (OuterVolumeSpecName: "scripts") pod "5732e34e-6330-4a36-9082-dbb50eede9f2" (UID: "5732e34e-6330-4a36-9082-dbb50eede9f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.632654 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5732e34e-6330-4a36-9082-dbb50eede9f2-kube-api-access-d259g" (OuterVolumeSpecName: "kube-api-access-d259g") pod "5732e34e-6330-4a36-9082-dbb50eede9f2" (UID: "5732e34e-6330-4a36-9082-dbb50eede9f2"). InnerVolumeSpecName "kube-api-access-d259g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.704167 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52577244-c181-4919-b5b0-040e229163db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52577244-c181-4919-b5b0-040e229163db" (UID: "52577244-c181-4919-b5b0-040e229163db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.721049 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5732e34e-6330-4a36-9082-dbb50eede9f2-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.721091 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d259g\" (UniqueName: \"kubernetes.io/projected/5732e34e-6330-4a36-9082-dbb50eede9f2-kube-api-access-d259g\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.721106 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52577244-c181-4919-b5b0-040e229163db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.921950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"52577244-c181-4919-b5b0-040e229163db","Type":"ContainerDied","Data":"8f709bd92c7c6c28297de5f91b3d8f5726929abc3fede49c29940651ade456cb"} Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.922160 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.922206 4931 scope.go:117] "RemoveContainer" containerID="7960131bc61ab6450751b905e24e2ccae8d9fe2d400984f5011874b3859c6719" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.927351 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-thxc2_5732e34e-6330-4a36-9082-dbb50eede9f2/ovs-vswitchd/0.log" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.929241 4931 generic.go:334] "Generic (PLEG): container finished" podID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" exitCode=137 Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.929303 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-thxc2" event={"ID":"5732e34e-6330-4a36-9082-dbb50eede9f2","Type":"ContainerDied","Data":"52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17"} Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.929318 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-thxc2" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.929342 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-thxc2" event={"ID":"5732e34e-6330-4a36-9082-dbb50eede9f2","Type":"ContainerDied","Data":"f259a70451b1edb6023ad4c42bb1037e4e2cbc756eed3599105a7d0ba07dc5ac"} Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.976144 4931 scope.go:117] "RemoveContainer" containerID="fa1a93081b269f4ada317cd8046ebb7f1a7c1edf1f6e97c13ed393eacd7e1973" Jan 30 05:30:14 crc kubenswrapper[4931]: I0130 05:30:14.982602 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.008730 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.019209 4931 scope.go:117] "RemoveContainer" containerID="cf71a5f4e5a8611b3edb8a350ba0e2eedbd78c0fb76770c94841152df4a3ab69" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.021344 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-thxc2"] Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.031723 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-thxc2"] Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.044622 4931 scope.go:117] "RemoveContainer" containerID="577bb47efe5f44d38e3c888fdf879028a229599eb11554a344d6a077afa58802" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.075971 4931 scope.go:117] "RemoveContainer" containerID="2aa8176e0269c78ed82e92b582f8a0a44311ad87daa1079e948a15315c72207f" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.113811 4931 scope.go:117] "RemoveContainer" containerID="cc9cceab2cf461936102038fbf1707f4be2e195decb5808193c7f51c4adb08d3" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.130251 4931 scope.go:117] "RemoveContainer" 
containerID="b4e57fcd32132c1d41dc41783803f5dfdbd53a0317437ca189732c5c62a33471" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.148806 4931 scope.go:117] "RemoveContainer" containerID="6088dfc85b1e09a936dd16faccae994e80e2ccb29840c2c09302842b83328fc2" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.166791 4931 scope.go:117] "RemoveContainer" containerID="072ac216076cb0fec2ec21789975f4b6fdf297b846d9774f980cb280a52a2718" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.191020 4931 scope.go:117] "RemoveContainer" containerID="840bb675eae49d372214aa49017516eb7cc03feb5f0cebb6fb56a2dd4d0837b9" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.209451 4931 scope.go:117] "RemoveContainer" containerID="01987d0b4f025a347544f55c09ea6cf4f3249f746d37222f35ee196eb4525b63" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.226666 4931 scope.go:117] "RemoveContainer" containerID="9eac75907fb7af02eb9159bedaf64c4ca7dee04ca441b549c3a48132b186515f" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.251259 4931 scope.go:117] "RemoveContainer" containerID="64945c3ef451f83b413801249e5cc8cedcb622a5a129ee80defec98e393eed29" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.275143 4931 scope.go:117] "RemoveContainer" containerID="de7939eb8c76f478b34a03e7dd08f4a308dc8bbb63a287db0a5b3eec8794cc7c" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.295268 4931 scope.go:117] "RemoveContainer" containerID="e78b14aff0684ab7de691a18e86ce169b9e67b8f02342d8eba7927de4cb39ec6" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.331135 4931 scope.go:117] "RemoveContainer" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.355348 4931 scope.go:117] "RemoveContainer" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.380674 4931 scope.go:117] "RemoveContainer" 
containerID="f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.417785 4931 scope.go:117] "RemoveContainer" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" Jan 30 05:30:15 crc kubenswrapper[4931]: E0130 05:30:15.418457 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17\": container with ID starting with 52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17 not found: ID does not exist" containerID="52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.418512 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17"} err="failed to get container status \"52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17\": rpc error: code = NotFound desc = could not find container \"52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17\": container with ID starting with 52b5db4015b0059c26ae8bc17932b213ec5274f0f558128594eedffdbd557f17 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.418548 4931 scope.go:117] "RemoveContainer" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" Jan 30 05:30:15 crc kubenswrapper[4931]: E0130 05:30:15.419358 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1\": container with ID starting with ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 not found: ID does not exist" containerID="ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1" Jan 30 05:30:15 crc 
kubenswrapper[4931]: I0130 05:30:15.419418 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1"} err="failed to get container status \"ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1\": rpc error: code = NotFound desc = could not find container \"ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1\": container with ID starting with ebc48ed9003a3c2930951ff8ca463098a233a76571845ef3bd7d328cc44868c1 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.419495 4931 scope.go:117] "RemoveContainer" containerID="f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054" Jan 30 05:30:15 crc kubenswrapper[4931]: E0130 05:30:15.420111 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054\": container with ID starting with f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054 not found: ID does not exist" containerID="f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.420176 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054"} err="failed to get container status \"f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054\": rpc error: code = NotFound desc = could not find container \"f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054\": container with ID starting with f760d9a2256038a984bdc2bea867eb0919b319e2fcb8d4e368c3e9c063a65054 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.445328 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52577244-c181-4919-b5b0-040e229163db" 
path="/var/lib/kubelet/pods/52577244-c181-4919-b5b0-040e229163db/volumes" Jan 30 05:30:15 crc kubenswrapper[4931]: I0130 05:30:15.450023 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" path="/var/lib/kubelet/pods/5732e34e-6330-4a36-9082-dbb50eede9f2/volumes" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.474821 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.479882 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671252 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-logs\") pod \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671388 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-combined-ca-bundle\") pod \"9ac55021-a07e-443f-9ee9-e7516556b975\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671505 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data\") pod \"9ac55021-a07e-443f-9ee9-e7516556b975\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671608 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpwf5\" (UniqueName: 
\"kubernetes.io/projected/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-kube-api-access-jpwf5\") pod \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671685 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data-custom\") pod \"9ac55021-a07e-443f-9ee9-e7516556b975\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671729 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data-custom\") pod \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671784 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-combined-ca-bundle\") pod \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.671878 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac55021-a07e-443f-9ee9-e7516556b975-logs\") pod \"9ac55021-a07e-443f-9ee9-e7516556b975\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.672059 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data\") pod \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\" (UID: \"c0c7aeee-9023-433a-83d0-aa0e9942a0ed\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 
05:30:17.672141 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mc9v\" (UniqueName: \"kubernetes.io/projected/9ac55021-a07e-443f-9ee9-e7516556b975-kube-api-access-7mc9v\") pod \"9ac55021-a07e-443f-9ee9-e7516556b975\" (UID: \"9ac55021-a07e-443f-9ee9-e7516556b975\") " Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.672553 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac55021-a07e-443f-9ee9-e7516556b975-logs" (OuterVolumeSpecName: "logs") pod "9ac55021-a07e-443f-9ee9-e7516556b975" (UID: "9ac55021-a07e-443f-9ee9-e7516556b975"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.672773 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac55021-a07e-443f-9ee9-e7516556b975-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.673139 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-logs" (OuterVolumeSpecName: "logs") pod "c0c7aeee-9023-433a-83d0-aa0e9942a0ed" (UID: "c0c7aeee-9023-433a-83d0-aa0e9942a0ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.677829 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c0c7aeee-9023-433a-83d0-aa0e9942a0ed" (UID: "c0c7aeee-9023-433a-83d0-aa0e9942a0ed"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.678316 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac55021-a07e-443f-9ee9-e7516556b975-kube-api-access-7mc9v" (OuterVolumeSpecName: "kube-api-access-7mc9v") pod "9ac55021-a07e-443f-9ee9-e7516556b975" (UID: "9ac55021-a07e-443f-9ee9-e7516556b975"). InnerVolumeSpecName "kube-api-access-7mc9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.679318 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9ac55021-a07e-443f-9ee9-e7516556b975" (UID: "9ac55021-a07e-443f-9ee9-e7516556b975"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.683599 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-kube-api-access-jpwf5" (OuterVolumeSpecName: "kube-api-access-jpwf5") pod "c0c7aeee-9023-433a-83d0-aa0e9942a0ed" (UID: "c0c7aeee-9023-433a-83d0-aa0e9942a0ed"). InnerVolumeSpecName "kube-api-access-jpwf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.698791 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ac55021-a07e-443f-9ee9-e7516556b975" (UID: "9ac55021-a07e-443f-9ee9-e7516556b975"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.711617 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0c7aeee-9023-433a-83d0-aa0e9942a0ed" (UID: "c0c7aeee-9023-433a-83d0-aa0e9942a0ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.730933 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data" (OuterVolumeSpecName: "config-data") pod "c0c7aeee-9023-433a-83d0-aa0e9942a0ed" (UID: "c0c7aeee-9023-433a-83d0-aa0e9942a0ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.735691 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data" (OuterVolumeSpecName: "config-data") pod "9ac55021-a07e-443f-9ee9-e7516556b975" (UID: "9ac55021-a07e-443f-9ee9-e7516556b975"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775068 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775125 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775146 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mc9v\" (UniqueName: \"kubernetes.io/projected/9ac55021-a07e-443f-9ee9-e7516556b975-kube-api-access-7mc9v\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775168 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775186 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775203 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775221 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpwf5\" (UniqueName: \"kubernetes.io/projected/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-kube-api-access-jpwf5\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775238 4931 
reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ac55021-a07e-443f-9ee9-e7516556b975-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.775255 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c7aeee-9023-433a-83d0-aa0e9942a0ed-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.978690 4931 generic.go:334] "Generic (PLEG): container finished" podID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerID="a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33" exitCode=137 Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.978759 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7789bbd757-45b5w" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.978789 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7789bbd757-45b5w" event={"ID":"c0c7aeee-9023-433a-83d0-aa0e9942a0ed","Type":"ContainerDied","Data":"a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33"} Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.978886 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7789bbd757-45b5w" event={"ID":"c0c7aeee-9023-433a-83d0-aa0e9942a0ed","Type":"ContainerDied","Data":"b30436eda9ab254987a1049643d0e45f01f12d10b3f44f43863aa93c4c7ce86b"} Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.978917 4931 scope.go:117] "RemoveContainer" containerID="a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33" Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.983954 4931 generic.go:334] "Generic (PLEG): container finished" podID="9ac55021-a07e-443f-9ee9-e7516556b975" containerID="9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae" exitCode=137 
Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.984030 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" event={"ID":"9ac55021-a07e-443f-9ee9-e7516556b975","Type":"ContainerDied","Data":"9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae"} Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.984083 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" event={"ID":"9ac55021-a07e-443f-9ee9-e7516556b975","Type":"ContainerDied","Data":"ddc1c9e389f315057ec0a85201907373bcd7582adeb9a6f356d1b36e03264dc9"} Jan 30 05:30:17 crc kubenswrapper[4931]: I0130 05:30:17.984189 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-867d8cd54-77bnr" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.015815 4931 scope.go:117] "RemoveContainer" containerID="eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.032933 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-867d8cd54-77bnr"] Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.044321 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-867d8cd54-77bnr"] Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.054363 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7789bbd757-45b5w"] Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.054439 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7789bbd757-45b5w"] Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.120124 4931 scope.go:117] "RemoveContainer" containerID="a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33" Jan 30 05:30:18 crc kubenswrapper[4931]: E0130 05:30:18.120782 4931 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33\": container with ID starting with a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33 not found: ID does not exist" containerID="a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.120834 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33"} err="failed to get container status \"a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33\": rpc error: code = NotFound desc = could not find container \"a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33\": container with ID starting with a93d68f310310e7303c3edab109a8ed880ebee37b66fadef54a1cd3f06192f33 not found: ID does not exist" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.120874 4931 scope.go:117] "RemoveContainer" containerID="eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123" Jan 30 05:30:18 crc kubenswrapper[4931]: E0130 05:30:18.121356 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123\": container with ID starting with eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123 not found: ID does not exist" containerID="eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.121418 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123"} err="failed to get container status \"eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123\": rpc error: code = NotFound desc = could 
not find container \"eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123\": container with ID starting with eaa8e6737e823302afed9a1ee3304e390f004c6a3502f957764d475d51b27123 not found: ID does not exist" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.121484 4931 scope.go:117] "RemoveContainer" containerID="9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.152015 4931 scope.go:117] "RemoveContainer" containerID="1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.179460 4931 scope.go:117] "RemoveContainer" containerID="9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae" Jan 30 05:30:18 crc kubenswrapper[4931]: E0130 05:30:18.180115 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae\": container with ID starting with 9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae not found: ID does not exist" containerID="9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.180178 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae"} err="failed to get container status \"9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae\": rpc error: code = NotFound desc = could not find container \"9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae\": container with ID starting with 9f630d7d16f08bd8c63c99b99649d0bf49fe0c9e741e4be3226750e5334469ae not found: ID does not exist" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.180218 4931 scope.go:117] "RemoveContainer" containerID="1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d" Jan 30 
05:30:18 crc kubenswrapper[4931]: E0130 05:30:18.180966 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d\": container with ID starting with 1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d not found: ID does not exist" containerID="1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d" Jan 30 05:30:18 crc kubenswrapper[4931]: I0130 05:30:18.181034 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d"} err="failed to get container status \"1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d\": rpc error: code = NotFound desc = could not find container \"1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d\": container with ID starting with 1f6797a8cd9dc5bb3661053087d443c4acbfc334ea75b6f96caee3ee4286c94d not found: ID does not exist" Jan 30 05:30:19 crc kubenswrapper[4931]: I0130 05:30:19.440454 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" path="/var/lib/kubelet/pods/9ac55021-a07e-443f-9ee9-e7516556b975/volumes" Jan 30 05:30:19 crc kubenswrapper[4931]: I0130 05:30:19.442202 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" path="/var/lib/kubelet/pods/c0c7aeee-9023-433a-83d0-aa0e9942a0ed/volumes" Jan 30 05:30:21 crc kubenswrapper[4931]: I0130 05:30:21.525985 4931 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod586d7a1d-7b2a-45ac-aacb-b77e95bf3d91"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod586d7a1d-7b2a-45ac-aacb-b77e95bf3d91] : Timed out while waiting for systemd to remove kubepods-besteffort-pod586d7a1d_7b2a_45ac_aacb_b77e95bf3d91.slice" Jan 30 
05:30:47 crc kubenswrapper[4931]: I0130 05:30:47.985631 4931 scope.go:117] "RemoveContainer" containerID="c1a9c2221cdd695e864a018795817226f2731d9a93a0ec46938ca160bd878ce0" Jan 30 05:30:48 crc kubenswrapper[4931]: I0130 05:30:48.031546 4931 scope.go:117] "RemoveContainer" containerID="cf669d89126cd05876fe2026bdc44224135e63c9e8ec5899f87342a850974a32" Jan 30 05:30:48 crc kubenswrapper[4931]: I0130 05:30:48.086766 4931 scope.go:117] "RemoveContainer" containerID="3eb355680179efbcbb2cf73e83f9b34f38755a348dc73a0a8db4b58a9c1de2f1" Jan 30 05:30:48 crc kubenswrapper[4931]: I0130 05:30:48.124953 4931 scope.go:117] "RemoveContainer" containerID="36f1f59d90f7e1367de837bd2375e2c11d0df21e5687e4d77b474faff3e8df0b" Jan 30 05:30:48 crc kubenswrapper[4931]: I0130 05:30:48.155222 4931 scope.go:117] "RemoveContainer" containerID="2951358824ae5ca54f437c7afd5ea7478602f9317a7330914d36e2cd66c684f6" Jan 30 05:30:48 crc kubenswrapper[4931]: I0130 05:30:48.185649 4931 scope.go:117] "RemoveContainer" containerID="1c39e215a63df32503630d9061f8096755ed069b7c32eec93d18140193ca977b" Jan 30 05:30:48 crc kubenswrapper[4931]: I0130 05:30:48.244495 4931 scope.go:117] "RemoveContainer" containerID="ea087fa247f921525edceed36a509f1f8a642fc4da20a1c681c64a7843494e21" Jan 30 05:31:48 crc kubenswrapper[4931]: I0130 05:31:48.855056 4931 scope.go:117] "RemoveContainer" containerID="a64e91cbe33af673e6689e436885784e9c445a56b737d4748cfcdbf6fce27a53" Jan 30 05:31:48 crc kubenswrapper[4931]: I0130 05:31:48.892021 4931 scope.go:117] "RemoveContainer" containerID="0aa30d8d9eae66f63b97cadd6e1c8c0a9f5fe5356f82b3165d21d6b90e8f054f" Jan 30 05:31:48 crc kubenswrapper[4931]: I0130 05:31:48.939628 4931 scope.go:117] "RemoveContainer" containerID="6a0d3a37541dd8bff3edd7d0762f4af19258be892f0b28a3ee8ffd644ba91460" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.004808 4931 scope.go:117] "RemoveContainer" containerID="2de911fa734d3f7bf71674e62b4beae90797f33e1cefb2483c1ee516fdc3ab44" Jan 30 05:31:49 crc 
kubenswrapper[4931]: I0130 05:31:49.030307 4931 scope.go:117] "RemoveContainer" containerID="397e61b0c2ae3421e485ee4187f967d9302d0416b0227f42b0fd6c0769496dc7" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.053870 4931 scope.go:117] "RemoveContainer" containerID="98322602699d7d942a6271f0a7fc74a73af1b5a299d4d538e2ee24bc7375a406" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.082705 4931 scope.go:117] "RemoveContainer" containerID="572f09b29f02dcc488a3f5a5c3037d927c6d17a2fec69ebb6da1e983b4bf5d1f" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.111554 4931 scope.go:117] "RemoveContainer" containerID="94fc6d9869d9820d8c965d9ddc61b4a6003c2bcfb528dd4f82ab1c383ce5be01" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.135519 4931 scope.go:117] "RemoveContainer" containerID="72e98c8676f758af58c2fffef7c54cd9bedf5ae4210e865b9220280e84a05578" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.164167 4931 scope.go:117] "RemoveContainer" containerID="b7fd522240b80788d80f7919145a4aa75ecf42cdb18b9fd6434f7a190f674261" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.194327 4931 scope.go:117] "RemoveContainer" containerID="cd7b183110cc89e2b8163b597d3085517a11ba4e30ab68470086c415c808b949" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.218632 4931 scope.go:117] "RemoveContainer" containerID="31798e9f13d46b8721aae715c1edfd7a01d30cecc4d59728bf20993fd26d459b" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.258787 4931 scope.go:117] "RemoveContainer" containerID="f8a2b41856adf7471c684772afc9b12f445fbd24f6ab5036ce18fde6331c17d4" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.290788 4931 scope.go:117] "RemoveContainer" containerID="1140a7961d708d05c85bc33a569a12461dd710e3403faa5dc7621241292e7e99" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.332965 4931 scope.go:117] "RemoveContainer" containerID="cf97da0cb0eda4f19afadca8bff99c148f7d4875216c60c7fab1145ddf6c9ea3" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 
05:31:49.366600 4931 scope.go:117] "RemoveContainer" containerID="342d9a5530d63ffbf73ce7c8e40d07fb36a0b7d11a82e475df103d0dfda95398" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.419226 4931 scope.go:117] "RemoveContainer" containerID="e65d7d2b5f976da6a48bf573c615d7b8b7b4da4391bf0bdecb5b42aeee5717fb" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.460714 4931 scope.go:117] "RemoveContainer" containerID="b62af9a31208f4045d6ab5fc627a9d3f9b63bc460555779073074656653065f9" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.494665 4931 scope.go:117] "RemoveContainer" containerID="2f4a9744870428aceb547b2acdf130704ce1aaa6370e1105462da3c72da4e168" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.526317 4931 scope.go:117] "RemoveContainer" containerID="424ff0eefa4783d3488bc19f3934cfec69b31ed4d156eca267b961eb0d363be6" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.557512 4931 scope.go:117] "RemoveContainer" containerID="43be7b60c2a00736cc6eb1df08fb2617062a5a1b05069e0aa41b60294b71b16c" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.607056 4931 scope.go:117] "RemoveContainer" containerID="15692c1b35f8b38884128c50d64f7fe3e0155bf28a22d7aceb44dcecc3b74210" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.631637 4931 scope.go:117] "RemoveContainer" containerID="1a58b66910dccd0a3d3aecf3a69cc3be05007daec35ed4f4da6ecaf7deb3050f" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.661679 4931 scope.go:117] "RemoveContainer" containerID="508e6e3003e86ee32a2b32dcec684271942a70c7d32070551d5605127eb8d9ad" Jan 30 05:31:49 crc kubenswrapper[4931]: I0130 05:31:49.710858 4931 scope.go:117] "RemoveContainer" containerID="dea51d6ee685a2470eaa0864347990ea744cedc85d00846568c67d56ba221ee1" Jan 30 05:31:57 crc kubenswrapper[4931]: I0130 05:31:57.362938 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:31:57 crc kubenswrapper[4931]: I0130 05:31:57.366305 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.387056 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bngkw"] Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390586 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-reaper" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390612 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-reaper" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390635 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-api" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390649 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-api" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390663 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390676 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390695 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-updater" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390707 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-updater" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390729 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="swift-recon-cron" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390740 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="swift-recon-cron" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390756 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390767 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-server" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390785 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="rsync" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390796 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="rsync" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390816 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390828 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390844 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390855 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-server" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390880 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-httpd" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390891 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-httpd" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390909 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390921 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390943 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker-log" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390960 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker-log" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.390980 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener-log" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.390997 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener-log" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391021 4931 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="09a19500-eb44-455f-a8b7-7ee5375b87ef" containerName="collect-profiles" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391038 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a19500-eb44-455f-a8b7-7ee5375b87ef" containerName="collect-profiles" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391064 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391082 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391107 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391125 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391159 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391176 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391201 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391217 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391239 4931 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391254 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-server" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391276 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391290 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391316 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-expirer" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391332 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-expirer" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391356 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391371 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391389 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-updater" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391404 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-updater" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391463 4931 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391480 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: E0130 05:32:02.391510 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server-init" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391527 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server-init" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391854 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="623f3c8f-d741-4ba4-baca-905a13102f38" containerName="mariadb-account-create-update" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391912 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391939 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391964 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.391982 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker-log" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392001 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-expirer" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 
05:32:02.392030 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-updater" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392051 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-api" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392079 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392104 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="swift-recon-cron" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392144 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovs-vswitchd" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392181 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392206 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-replicator" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392225 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="rsync" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392252 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="object-auditor" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392271 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-reaper" Jan 30 05:32:02 crc 
kubenswrapper[4931]: I0130 05:32:02.392293 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-updater" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392314 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener-log" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392334 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="container-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392354 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c7aeee-9023-433a-83d0-aa0e9942a0ed" containerName="barbican-worker" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392377 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f9790c-c395-4c72-b569-3140f703b56f" containerName="neutron-httpd" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392393 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac55021-a07e-443f-9ee9-e7516556b975" containerName="barbican-keystone-listener" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392415 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="5732e34e-6330-4a36-9082-dbb50eede9f2" containerName="ovsdb-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392474 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52577244-c181-4919-b5b0-040e229163db" containerName="account-server" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.392489 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a19500-eb44-455f-a8b7-7ee5375b87ef" containerName="collect-profiles" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.394153 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.421447 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bngkw"] Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.467178 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-catalog-content\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.467276 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fjwq\" (UniqueName: \"kubernetes.io/projected/567bd9dc-af96-410a-afd9-bda3e473d9af-kube-api-access-4fjwq\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.467612 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-utilities\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.569719 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-catalog-content\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.569832 4931 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4fjwq\" (UniqueName: \"kubernetes.io/projected/567bd9dc-af96-410a-afd9-bda3e473d9af-kube-api-access-4fjwq\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.570010 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-utilities\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.570670 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-catalog-content\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.570835 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-utilities\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.611057 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fjwq\" (UniqueName: \"kubernetes.io/projected/567bd9dc-af96-410a-afd9-bda3e473d9af-kube-api-access-4fjwq\") pod \"redhat-operators-bngkw\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.729804 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:02 crc kubenswrapper[4931]: I0130 05:32:02.997664 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bngkw"] Jan 30 05:32:03 crc kubenswrapper[4931]: I0130 05:32:03.512991 4931 generic.go:334] "Generic (PLEG): container finished" podID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerID="62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db" exitCode=0 Jan 30 05:32:03 crc kubenswrapper[4931]: I0130 05:32:03.513107 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bngkw" event={"ID":"567bd9dc-af96-410a-afd9-bda3e473d9af","Type":"ContainerDied","Data":"62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db"} Jan 30 05:32:03 crc kubenswrapper[4931]: I0130 05:32:03.513262 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bngkw" event={"ID":"567bd9dc-af96-410a-afd9-bda3e473d9af","Type":"ContainerStarted","Data":"bfa56798d790bb83e1e8dd951ca707c64a2ef5d7129037e48b02eafae5e0e48f"} Jan 30 05:32:03 crc kubenswrapper[4931]: I0130 05:32:03.514697 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:32:04 crc kubenswrapper[4931]: I0130 05:32:04.531654 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bngkw" event={"ID":"567bd9dc-af96-410a-afd9-bda3e473d9af","Type":"ContainerStarted","Data":"850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26"} Jan 30 05:32:05 crc kubenswrapper[4931]: I0130 05:32:05.546217 4931 generic.go:334] "Generic (PLEG): container finished" podID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerID="850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26" exitCode=0 Jan 30 05:32:05 crc kubenswrapper[4931]: I0130 05:32:05.546275 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bngkw" event={"ID":"567bd9dc-af96-410a-afd9-bda3e473d9af","Type":"ContainerDied","Data":"850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26"} Jan 30 05:32:06 crc kubenswrapper[4931]: I0130 05:32:06.560636 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bngkw" event={"ID":"567bd9dc-af96-410a-afd9-bda3e473d9af","Type":"ContainerStarted","Data":"7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c"} Jan 30 05:32:06 crc kubenswrapper[4931]: I0130 05:32:06.590253 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bngkw" podStartSLOduration=2.141310454 podStartE2EDuration="4.590087053s" podCreationTimestamp="2026-01-30 05:32:02 +0000 UTC" firstStartedPulling="2026-01-30 05:32:03.514381555 +0000 UTC m=+1458.884291822" lastFinishedPulling="2026-01-30 05:32:05.963158124 +0000 UTC m=+1461.333068421" observedRunningTime="2026-01-30 05:32:06.588004399 +0000 UTC m=+1461.957914706" watchObservedRunningTime="2026-01-30 05:32:06.590087053 +0000 UTC m=+1461.959997350" Jan 30 05:32:12 crc kubenswrapper[4931]: I0130 05:32:12.730142 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:12 crc kubenswrapper[4931]: I0130 05:32:12.730616 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:13 crc kubenswrapper[4931]: I0130 05:32:13.777902 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bngkw" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="registry-server" probeResult="failure" output=< Jan 30 05:32:13 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 05:32:13 crc kubenswrapper[4931]: > Jan 30 05:32:22 crc kubenswrapper[4931]: I0130 
05:32:22.812899 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:22 crc kubenswrapper[4931]: I0130 05:32:22.906328 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:23 crc kubenswrapper[4931]: I0130 05:32:23.072103 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bngkw"] Jan 30 05:32:24 crc kubenswrapper[4931]: I0130 05:32:24.753454 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bngkw" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="registry-server" containerID="cri-o://7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c" gracePeriod=2 Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.287209 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.348716 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-catalog-content\") pod \"567bd9dc-af96-410a-afd9-bda3e473d9af\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.348801 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-utilities\") pod \"567bd9dc-af96-410a-afd9-bda3e473d9af\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.348920 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fjwq\" (UniqueName: 
\"kubernetes.io/projected/567bd9dc-af96-410a-afd9-bda3e473d9af-kube-api-access-4fjwq\") pod \"567bd9dc-af96-410a-afd9-bda3e473d9af\" (UID: \"567bd9dc-af96-410a-afd9-bda3e473d9af\") " Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.352266 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-utilities" (OuterVolumeSpecName: "utilities") pod "567bd9dc-af96-410a-afd9-bda3e473d9af" (UID: "567bd9dc-af96-410a-afd9-bda3e473d9af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.357044 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567bd9dc-af96-410a-afd9-bda3e473d9af-kube-api-access-4fjwq" (OuterVolumeSpecName: "kube-api-access-4fjwq") pod "567bd9dc-af96-410a-afd9-bda3e473d9af" (UID: "567bd9dc-af96-410a-afd9-bda3e473d9af"). InnerVolumeSpecName "kube-api-access-4fjwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.451065 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.451104 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fjwq\" (UniqueName: \"kubernetes.io/projected/567bd9dc-af96-410a-afd9-bda3e473d9af-kube-api-access-4fjwq\") on node \"crc\" DevicePath \"\"" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.528168 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "567bd9dc-af96-410a-afd9-bda3e473d9af" (UID: "567bd9dc-af96-410a-afd9-bda3e473d9af"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.552554 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567bd9dc-af96-410a-afd9-bda3e473d9af-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.774205 4931 generic.go:334] "Generic (PLEG): container finished" podID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerID="7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c" exitCode=0 Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.774267 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bngkw" event={"ID":"567bd9dc-af96-410a-afd9-bda3e473d9af","Type":"ContainerDied","Data":"7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c"} Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.774309 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bngkw" event={"ID":"567bd9dc-af96-410a-afd9-bda3e473d9af","Type":"ContainerDied","Data":"bfa56798d790bb83e1e8dd951ca707c64a2ef5d7129037e48b02eafae5e0e48f"} Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.774341 4931 scope.go:117] "RemoveContainer" containerID="7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.774583 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bngkw" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.810289 4931 scope.go:117] "RemoveContainer" containerID="850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.834983 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bngkw"] Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.846403 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bngkw"] Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.852334 4931 scope.go:117] "RemoveContainer" containerID="62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.883369 4931 scope.go:117] "RemoveContainer" containerID="7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c" Jan 30 05:32:25 crc kubenswrapper[4931]: E0130 05:32:25.884049 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c\": container with ID starting with 7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c not found: ID does not exist" containerID="7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.884138 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c"} err="failed to get container status \"7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c\": rpc error: code = NotFound desc = could not find container \"7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c\": container with ID starting with 7b64b470a99e0f0df4935c5100bf4ee1e8fdeb10b1a5502d4d1d1b75be9b755c not found: ID does 
not exist" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.884178 4931 scope.go:117] "RemoveContainer" containerID="850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26" Jan 30 05:32:25 crc kubenswrapper[4931]: E0130 05:32:25.884819 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26\": container with ID starting with 850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26 not found: ID does not exist" containerID="850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.884856 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26"} err="failed to get container status \"850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26\": rpc error: code = NotFound desc = could not find container \"850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26\": container with ID starting with 850b640765f16d3f47d1e2cd35d91b07397292fd55c8906b2e0325eaaf41ca26 not found: ID does not exist" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.884888 4931 scope.go:117] "RemoveContainer" containerID="62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db" Jan 30 05:32:25 crc kubenswrapper[4931]: E0130 05:32:25.885471 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db\": container with ID starting with 62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db not found: ID does not exist" containerID="62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db" Jan 30 05:32:25 crc kubenswrapper[4931]: I0130 05:32:25.885535 4931 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db"} err="failed to get container status \"62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db\": rpc error: code = NotFound desc = could not find container \"62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db\": container with ID starting with 62862ffef7443966c3af316e55c80f6a491b595913973cd1336120ed4addd5db not found: ID does not exist"
Jan 30 05:32:27 crc kubenswrapper[4931]: I0130 05:32:27.363194 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:32:27 crc kubenswrapper[4931]: I0130 05:32:27.363593 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:32:27 crc kubenswrapper[4931]: I0130 05:32:27.434491 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" path="/var/lib/kubelet/pods/567bd9dc-af96-410a-afd9-bda3e473d9af/volumes"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.690241 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pdcrd"]
Jan 30 05:32:34 crc kubenswrapper[4931]: E0130 05:32:34.691152 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="extract-utilities"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.691167 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="extract-utilities"
Jan 30 05:32:34 crc kubenswrapper[4931]: E0130 05:32:34.691178 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="extract-content"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.691187 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="extract-content"
Jan 30 05:32:34 crc kubenswrapper[4931]: E0130 05:32:34.691220 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="registry-server"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.691229 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="registry-server"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.691402 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="567bd9dc-af96-410a-afd9-bda3e473d9af" containerName="registry-server"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.692597 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.716145 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pdcrd"]
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.797634 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-catalog-content\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.797726 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-utilities\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.797773 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfjx\" (UniqueName: \"kubernetes.io/projected/c18016f0-c17f-4cc9-ada3-70547fdd56d5-kube-api-access-bzfjx\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.899117 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-utilities\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.899204 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfjx\" (UniqueName: \"kubernetes.io/projected/c18016f0-c17f-4cc9-ada3-70547fdd56d5-kube-api-access-bzfjx\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.899251 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-catalog-content\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.899866 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-catalog-content\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.899882 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-utilities\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:34 crc kubenswrapper[4931]: I0130 05:32:34.918030 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfjx\" (UniqueName: \"kubernetes.io/projected/c18016f0-c17f-4cc9-ada3-70547fdd56d5-kube-api-access-bzfjx\") pod \"community-operators-pdcrd\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") " pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:35 crc kubenswrapper[4931]: I0130 05:32:35.024608 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:35 crc kubenswrapper[4931]: I0130 05:32:35.557748 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pdcrd"]
Jan 30 05:32:35 crc kubenswrapper[4931]: I0130 05:32:35.890353 4931 generic.go:334] "Generic (PLEG): container finished" podID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerID="7bebd804e544da219b011d2d1afda89cddb0ce0335d15b4c02c03698505a49a5" exitCode=0
Jan 30 05:32:35 crc kubenswrapper[4931]: I0130 05:32:35.890409 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdcrd" event={"ID":"c18016f0-c17f-4cc9-ada3-70547fdd56d5","Type":"ContainerDied","Data":"7bebd804e544da219b011d2d1afda89cddb0ce0335d15b4c02c03698505a49a5"}
Jan 30 05:32:35 crc kubenswrapper[4931]: I0130 05:32:35.890480 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdcrd" event={"ID":"c18016f0-c17f-4cc9-ada3-70547fdd56d5","Type":"ContainerStarted","Data":"0d39ebb93c7c1261d2ef3495adffe2a9945e07bb2fddb670b37115156b3edff5"}
Jan 30 05:32:36 crc kubenswrapper[4931]: I0130 05:32:36.900710 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdcrd" event={"ID":"c18016f0-c17f-4cc9-ada3-70547fdd56d5","Type":"ContainerStarted","Data":"9f887e3a8d3842c13ac5dc51c372e65ef00a63929109cc6e6a2c9b0bb12255ff"}
Jan 30 05:32:37 crc kubenswrapper[4931]: I0130 05:32:37.914795 4931 generic.go:334] "Generic (PLEG): container finished" podID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerID="9f887e3a8d3842c13ac5dc51c372e65ef00a63929109cc6e6a2c9b0bb12255ff" exitCode=0
Jan 30 05:32:37 crc kubenswrapper[4931]: I0130 05:32:37.914873 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdcrd" event={"ID":"c18016f0-c17f-4cc9-ada3-70547fdd56d5","Type":"ContainerDied","Data":"9f887e3a8d3842c13ac5dc51c372e65ef00a63929109cc6e6a2c9b0bb12255ff"}
Jan 30 05:32:38 crc kubenswrapper[4931]: I0130 05:32:38.924667 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdcrd" event={"ID":"c18016f0-c17f-4cc9-ada3-70547fdd56d5","Type":"ContainerStarted","Data":"8f836158176bf8715147188f37ab5cf8a46459cebe0bf751eef704d908bdceed"}
Jan 30 05:32:38 crc kubenswrapper[4931]: I0130 05:32:38.947498 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pdcrd" podStartSLOduration=2.523688635 podStartE2EDuration="4.947477815s" podCreationTimestamp="2026-01-30 05:32:34 +0000 UTC" firstStartedPulling="2026-01-30 05:32:35.892588038 +0000 UTC m=+1491.262498325" lastFinishedPulling="2026-01-30 05:32:38.316377208 +0000 UTC m=+1493.686287505" observedRunningTime="2026-01-30 05:32:38.945605493 +0000 UTC m=+1494.315515760" watchObservedRunningTime="2026-01-30 05:32:38.947477815 +0000 UTC m=+1494.317388082"
Jan 30 05:32:45 crc kubenswrapper[4931]: I0130 05:32:45.025360 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:45 crc kubenswrapper[4931]: I0130 05:32:45.025776 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:45 crc kubenswrapper[4931]: I0130 05:32:45.096906 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:46 crc kubenswrapper[4931]: I0130 05:32:46.059037 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:46 crc kubenswrapper[4931]: I0130 05:32:46.141026 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pdcrd"]
Jan 30 05:32:48 crc kubenswrapper[4931]: I0130 05:32:48.017973 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pdcrd" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="registry-server" containerID="cri-o://8f836158176bf8715147188f37ab5cf8a46459cebe0bf751eef704d908bdceed" gracePeriod=2
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.025340 4931 generic.go:334] "Generic (PLEG): container finished" podID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerID="8f836158176bf8715147188f37ab5cf8a46459cebe0bf751eef704d908bdceed" exitCode=0
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.025398 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdcrd" event={"ID":"c18016f0-c17f-4cc9-ada3-70547fdd56d5","Type":"ContainerDied","Data":"8f836158176bf8715147188f37ab5cf8a46459cebe0bf751eef704d908bdceed"}
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.308845 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.500935 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-catalog-content\") pod \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") "
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.501611 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-utilities\") pod \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") "
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.502054 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfjx\" (UniqueName: \"kubernetes.io/projected/c18016f0-c17f-4cc9-ada3-70547fdd56d5-kube-api-access-bzfjx\") pod \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\" (UID: \"c18016f0-c17f-4cc9-ada3-70547fdd56d5\") "
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.504108 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-utilities" (OuterVolumeSpecName: "utilities") pod "c18016f0-c17f-4cc9-ada3-70547fdd56d5" (UID: "c18016f0-c17f-4cc9-ada3-70547fdd56d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.511499 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18016f0-c17f-4cc9-ada3-70547fdd56d5-kube-api-access-bzfjx" (OuterVolumeSpecName: "kube-api-access-bzfjx") pod "c18016f0-c17f-4cc9-ada3-70547fdd56d5" (UID: "c18016f0-c17f-4cc9-ada3-70547fdd56d5"). InnerVolumeSpecName "kube-api-access-bzfjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.555691 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c18016f0-c17f-4cc9-ada3-70547fdd56d5" (UID: "c18016f0-c17f-4cc9-ada3-70547fdd56d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.603911 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.603952 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfjx\" (UniqueName: \"kubernetes.io/projected/c18016f0-c17f-4cc9-ada3-70547fdd56d5-kube-api-access-bzfjx\") on node \"crc\" DevicePath \"\""
Jan 30 05:32:49 crc kubenswrapper[4931]: I0130 05:32:49.603970 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18016f0-c17f-4cc9-ada3-70547fdd56d5-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.041817 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdcrd" event={"ID":"c18016f0-c17f-4cc9-ada3-70547fdd56d5","Type":"ContainerDied","Data":"0d39ebb93c7c1261d2ef3495adffe2a9945e07bb2fddb670b37115156b3edff5"}
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.041895 4931 scope.go:117] "RemoveContainer" containerID="8f836158176bf8715147188f37ab5cf8a46459cebe0bf751eef704d908bdceed"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.042118 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pdcrd"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.066104 4931 scope.go:117] "RemoveContainer" containerID="9f887e3a8d3842c13ac5dc51c372e65ef00a63929109cc6e6a2c9b0bb12255ff"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.091662 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pdcrd"]
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.102351 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pdcrd"]
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.117452 4931 scope.go:117] "RemoveContainer" containerID="7bebd804e544da219b011d2d1afda89cddb0ce0335d15b4c02c03698505a49a5"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.170154 4931 scope.go:117] "RemoveContainer" containerID="1ee4814b304dc2facdec4fe5a7ec548e21648d29d6f5eb9a0d58da2eecb4e24b"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.193271 4931 scope.go:117] "RemoveContainer" containerID="1e41b0f0000283bd1a29c28f4d8fdb74fdd5389e3ebd8804eac1db1375b10248"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.230198 4931 scope.go:117] "RemoveContainer" containerID="136922e5f994f3ec703ee0b76647539238ec38bef505c93a23f26dc8f73ef24d"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.259288 4931 scope.go:117] "RemoveContainer" containerID="703277ac00408ad9f7d1f58fc77ac68f5eef4a1090a051f9ca88ddf484b5fda4"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.318914 4931 scope.go:117] "RemoveContainer" containerID="00f6a2dd44878296bf4733164be83ce28b802aa8c1f8943860365c668511c527"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.346893 4931 scope.go:117] "RemoveContainer" containerID="d39f6f20169ccd7e0eea2f20181dc418bff8322ca21fcf82d4f5c3d022992a6f"
Jan 30 05:32:50 crc kubenswrapper[4931]: I0130 05:32:50.396934 4931 scope.go:117] "RemoveContainer" containerID="5737a395d9d729d9146705b5cbd342fcc1f4ff9a1712777cd6b02a06ccdce9e4"
Jan 30 05:32:51 crc kubenswrapper[4931]: I0130 05:32:51.440057 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" path="/var/lib/kubelet/pods/c18016f0-c17f-4cc9-ada3-70547fdd56d5/volumes"
Jan 30 05:32:57 crc kubenswrapper[4931]: I0130 05:32:57.362904 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:32:57 crc kubenswrapper[4931]: I0130 05:32:57.364539 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:32:57 crc kubenswrapper[4931]: I0130 05:32:57.364631 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
Jan 30 05:32:57 crc kubenswrapper[4931]: I0130 05:32:57.365561 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:32:57 crc kubenswrapper[4931]: I0130 05:32:57.365666 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" gracePeriod=600
Jan 30 05:32:57 crc kubenswrapper[4931]: E0130 05:32:57.510395 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:32:58 crc kubenswrapper[4931]: I0130 05:32:58.139510 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" exitCode=0
Jan 30 05:32:58 crc kubenswrapper[4931]: I0130 05:32:58.139601 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0"}
Jan 30 05:32:58 crc kubenswrapper[4931]: I0130 05:32:58.139665 4931 scope.go:117] "RemoveContainer" containerID="083c2726f719c1b6c228fc0d209a309a403985263c1ced3ea0982529442fd973"
Jan 30 05:32:58 crc kubenswrapper[4931]: I0130 05:32:58.140634 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0"
Jan 30 05:32:58 crc kubenswrapper[4931]: E0130 05:32:58.141476 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.466127 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kdmvh"]
Jan 30 05:33:05 crc kubenswrapper[4931]: E0130 05:33:05.468402 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="extract-utilities"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.468478 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="extract-utilities"
Jan 30 05:33:05 crc kubenswrapper[4931]: E0130 05:33:05.468513 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="registry-server"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.468533 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="registry-server"
Jan 30 05:33:05 crc kubenswrapper[4931]: E0130 05:33:05.468562 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="extract-content"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.468578 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="extract-content"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.468949 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18016f0-c17f-4cc9-ada3-70547fdd56d5" containerName="registry-server"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.470817 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.498820 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdmvh"]
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.668766 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-catalog-content\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.668862 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-utilities\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.668936 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pcmx\" (UniqueName: \"kubernetes.io/projected/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-kube-api-access-8pcmx\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.770705 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-catalog-content\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.770794 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-utilities\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.770849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pcmx\" (UniqueName: \"kubernetes.io/projected/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-kube-api-access-8pcmx\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.771237 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-catalog-content\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.771592 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-utilities\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.797182 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pcmx\" (UniqueName: \"kubernetes.io/projected/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-kube-api-access-8pcmx\") pod \"certified-operators-kdmvh\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") " pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:05 crc kubenswrapper[4931]: I0130 05:33:05.810583 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:06 crc kubenswrapper[4931]: I0130 05:33:06.291885 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdmvh"]
Jan 30 05:33:07 crc kubenswrapper[4931]: I0130 05:33:07.235106 4931 generic.go:334] "Generic (PLEG): container finished" podID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerID="ac8579f185e9d5a31fb735e240a7db9474963e4a3fc8a2610621483b60d98f53" exitCode=0
Jan 30 05:33:07 crc kubenswrapper[4931]: I0130 05:33:07.235152 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdmvh" event={"ID":"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d","Type":"ContainerDied","Data":"ac8579f185e9d5a31fb735e240a7db9474963e4a3fc8a2610621483b60d98f53"}
Jan 30 05:33:07 crc kubenswrapper[4931]: I0130 05:33:07.235182 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdmvh" event={"ID":"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d","Type":"ContainerStarted","Data":"a3a8b0b1662721461bf6495f03be27556288bb0c9d87899aed8cd07bec3d290d"}
Jan 30 05:33:08 crc kubenswrapper[4931]: I0130 05:33:08.272788 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdmvh" event={"ID":"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d","Type":"ContainerStarted","Data":"d1c237bd63c74b3077c6a07d297620056ee6292b8cbe29e75c0b1d8fe17f2d15"}
Jan 30 05:33:09 crc kubenswrapper[4931]: I0130 05:33:09.287387 4931 generic.go:334] "Generic (PLEG): container finished" podID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerID="d1c237bd63c74b3077c6a07d297620056ee6292b8cbe29e75c0b1d8fe17f2d15" exitCode=0
Jan 30 05:33:09 crc kubenswrapper[4931]: I0130 05:33:09.287446 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdmvh" event={"ID":"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d","Type":"ContainerDied","Data":"d1c237bd63c74b3077c6a07d297620056ee6292b8cbe29e75c0b1d8fe17f2d15"}
Jan 30 05:33:09 crc kubenswrapper[4931]: I0130 05:33:09.423639 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0"
Jan 30 05:33:09 crc kubenswrapper[4931]: E0130 05:33:09.424510 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:33:10 crc kubenswrapper[4931]: I0130 05:33:10.300438 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdmvh" event={"ID":"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d","Type":"ContainerStarted","Data":"ae533e994fe609ff12ac93cc07298624016e31c68088cbe01149f6798c73cc4c"}
Jan 30 05:33:10 crc kubenswrapper[4931]: I0130 05:33:10.326097 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kdmvh" podStartSLOduration=2.761214968 podStartE2EDuration="5.32606879s" podCreationTimestamp="2026-01-30 05:33:05 +0000 UTC" firstStartedPulling="2026-01-30 05:33:07.237746704 +0000 UTC m=+1522.607656971" lastFinishedPulling="2026-01-30 05:33:09.802600506 +0000 UTC m=+1525.172510793" observedRunningTime="2026-01-30 05:33:10.319499697 +0000 UTC m=+1525.689409984" watchObservedRunningTime="2026-01-30 05:33:10.32606879 +0000 UTC m=+1525.695979087"
Jan 30 05:33:15 crc kubenswrapper[4931]: I0130 05:33:15.810890 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:15 crc kubenswrapper[4931]: I0130 05:33:15.811279 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:15 crc kubenswrapper[4931]: I0130 05:33:15.901258 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:16 crc kubenswrapper[4931]: I0130 05:33:16.391356 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:16 crc kubenswrapper[4931]: I0130 05:33:16.445117 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdmvh"]
Jan 30 05:33:18 crc kubenswrapper[4931]: I0130 05:33:18.367853 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kdmvh" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="registry-server" containerID="cri-o://ae533e994fe609ff12ac93cc07298624016e31c68088cbe01149f6798c73cc4c" gracePeriod=2
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.381499 4931 generic.go:334] "Generic (PLEG): container finished" podID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerID="ae533e994fe609ff12ac93cc07298624016e31c68088cbe01149f6798c73cc4c" exitCode=0
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.381630 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdmvh" event={"ID":"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d","Type":"ContainerDied","Data":"ae533e994fe609ff12ac93cc07298624016e31c68088cbe01149f6798c73cc4c"}
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.381968 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdmvh" event={"ID":"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d","Type":"ContainerDied","Data":"a3a8b0b1662721461bf6495f03be27556288bb0c9d87899aed8cd07bec3d290d"}
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.381992 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a8b0b1662721461bf6495f03be27556288bb0c9d87899aed8cd07bec3d290d"
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.434943 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.600605 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-utilities\") pod \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") "
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.600773 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pcmx\" (UniqueName: \"kubernetes.io/projected/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-kube-api-access-8pcmx\") pod \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") "
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.600959 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-catalog-content\") pod \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\" (UID: \"9f9d5f6a-c304-4ef4-aebb-9f346e7f786d\") "
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.602391 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-utilities" (OuterVolumeSpecName: "utilities") pod "9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" (UID: "9f9d5f6a-c304-4ef4-aebb-9f346e7f786d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.617023 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-kube-api-access-8pcmx" (OuterVolumeSpecName: "kube-api-access-8pcmx") pod "9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" (UID: "9f9d5f6a-c304-4ef4-aebb-9f346e7f786d"). InnerVolumeSpecName "kube-api-access-8pcmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.690697 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" (UID: "9f9d5f6a-c304-4ef4-aebb-9f346e7f786d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.703671 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.703710 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pcmx\" (UniqueName: \"kubernetes.io/projected/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-kube-api-access-8pcmx\") on node \"crc\" DevicePath \"\""
Jan 30 05:33:19 crc kubenswrapper[4931]: I0130 05:33:19.703723 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:33:20 crc kubenswrapper[4931]: I0130 05:33:20.391271 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdmvh"
Jan 30 05:33:20 crc kubenswrapper[4931]: I0130 05:33:20.440976 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdmvh"]
Jan 30 05:33:20 crc kubenswrapper[4931]: I0130 05:33:20.445921 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kdmvh"]
Jan 30 05:33:21 crc kubenswrapper[4931]: I0130 05:33:21.436634 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" path="/var/lib/kubelet/pods/9f9d5f6a-c304-4ef4-aebb-9f346e7f786d/volumes"
Jan 30 05:33:24 crc kubenswrapper[4931]: I0130 05:33:24.422582 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0"
Jan 30 05:33:24 crc kubenswrapper[4931]: E0130 05:33:24.424013 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:33:37 crc kubenswrapper[4931]: I0130 05:33:37.422892 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0"
Jan 30 05:33:37 crc kubenswrapper[4931]: E0130 05:33:37.423847 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:33:49 crc kubenswrapper[4931]: I0130 05:33:49.422815 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:33:49 crc kubenswrapper[4931]: E0130 05:33:49.423909 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.536363 4931 scope.go:117] "RemoveContainer" containerID="daeb4e60a2f2e8b0ecc5573dd48689c8e466dc66250fe49e905723d105d79613" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.570470 4931 scope.go:117] "RemoveContainer" containerID="6c4ebb40e4402e95e337ac0e8eea0a4fb903b22dbcfc5ac614853d0c17f24e3a" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.628944 4931 scope.go:117] "RemoveContainer" containerID="3baca7478354f0fb4066c265761c1fd4465b993347daea084eeb2d40cd40bed6" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.655810 4931 scope.go:117] "RemoveContainer" containerID="c948d726013eb4e8273ef998118172023cae6536ca99db77a7f1ebd4884def12" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.684201 4931 scope.go:117] "RemoveContainer" containerID="c2771265ae8a990e0e69c0f116c64cc25eecd94f6e185173fb3394e2e6fbe468" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.742717 4931 scope.go:117] "RemoveContainer" containerID="ed1045d9c4b634bdebcb19b30994b9f7ac39021883a8b98a833d09018502f440" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.788222 4931 scope.go:117] "RemoveContainer" containerID="9398f7e713fb447b3a151d286be2d2910e4d8535fd421e906c46b8cc2c9a4728" Jan 30 05:33:50 crc 
kubenswrapper[4931]: I0130 05:33:50.816757 4931 scope.go:117] "RemoveContainer" containerID="2823dcc09d156bc746ffbc3ab196c3d6e136f453bf377837d7cce10861a168f4" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.851098 4931 scope.go:117] "RemoveContainer" containerID="976d06480a8d07dd149684c2767dbf90e61f0fd7efbc4d623ba32e7d83fb861e" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.884182 4931 scope.go:117] "RemoveContainer" containerID="edf9b3d1d8428caf5db14c3063b00d649e4d886f974003048a406d3bcf0b7c43" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.927004 4931 scope.go:117] "RemoveContainer" containerID="571155fa2c4a4cc11bc78f96b7b5b636cdda183726d54338d2bf0cc02d77f003" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.957347 4931 scope.go:117] "RemoveContainer" containerID="83b746b64ed4eb6c60d1909a3bd54f4f030ff6949e754157b24384ceb5419c06" Jan 30 05:33:50 crc kubenswrapper[4931]: I0130 05:33:50.984605 4931 scope.go:117] "RemoveContainer" containerID="5712d27fd9c195ed4c35f4530c38c5e87c6a63708aedb0fa792d34d9e26a0b9a" Jan 30 05:33:51 crc kubenswrapper[4931]: I0130 05:33:51.016317 4931 scope.go:117] "RemoveContainer" containerID="02a426537f79889d684c812318ea1dd0bd0af03a098fb1e7d47cd94e43353e1c" Jan 30 05:34:02 crc kubenswrapper[4931]: I0130 05:34:02.422826 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:34:02 crc kubenswrapper[4931]: E0130 05:34:02.423802 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.216875 4931 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hjwsc"] Jan 30 05:34:12 crc kubenswrapper[4931]: E0130 05:34:12.217642 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="extract-content" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.217657 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="extract-content" Jan 30 05:34:12 crc kubenswrapper[4931]: E0130 05:34:12.217668 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="registry-server" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.217676 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="registry-server" Jan 30 05:34:12 crc kubenswrapper[4931]: E0130 05:34:12.217706 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="extract-utilities" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.217715 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="extract-utilities" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.217896 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9d5f6a-c304-4ef4-aebb-9f346e7f786d" containerName="registry-server" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.218988 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.239388 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjwsc"] Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.240514 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-catalog-content\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.240760 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-utilities\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.240837 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnf4\" (UniqueName: \"kubernetes.io/projected/b37cadec-51d4-44c5-bea0-fec0eec934a5-kube-api-access-smnf4\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.346034 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-utilities\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.346100 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-smnf4\" (UniqueName: \"kubernetes.io/projected/b37cadec-51d4-44c5-bea0-fec0eec934a5-kube-api-access-smnf4\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.346276 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-catalog-content\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.346962 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-catalog-content\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.347978 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-utilities\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.374555 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnf4\" (UniqueName: \"kubernetes.io/projected/b37cadec-51d4-44c5-bea0-fec0eec934a5-kube-api-access-smnf4\") pod \"redhat-marketplace-hjwsc\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.563394 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.818584 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjwsc"] Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.962943 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjwsc" event={"ID":"b37cadec-51d4-44c5-bea0-fec0eec934a5","Type":"ContainerStarted","Data":"91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2"} Jan 30 05:34:12 crc kubenswrapper[4931]: I0130 05:34:12.962984 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjwsc" event={"ID":"b37cadec-51d4-44c5-bea0-fec0eec934a5","Type":"ContainerStarted","Data":"baa1bec2ab7247b413a344955d2ac6f2777451917fd1e7ba7fd3c0f693e3f21b"} Jan 30 05:34:13 crc kubenswrapper[4931]: I0130 05:34:13.976731 4931 generic.go:334] "Generic (PLEG): container finished" podID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerID="91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2" exitCode=0 Jan 30 05:34:13 crc kubenswrapper[4931]: I0130 05:34:13.976803 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjwsc" event={"ID":"b37cadec-51d4-44c5-bea0-fec0eec934a5","Type":"ContainerDied","Data":"91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2"} Jan 30 05:34:14 crc kubenswrapper[4931]: I0130 05:34:14.422598 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:34:14 crc kubenswrapper[4931]: E0130 05:34:14.423564 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:34:14 crc kubenswrapper[4931]: I0130 05:34:14.990944 4931 generic.go:334] "Generic (PLEG): container finished" podID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerID="e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e" exitCode=0 Jan 30 05:34:14 crc kubenswrapper[4931]: I0130 05:34:14.991014 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjwsc" event={"ID":"b37cadec-51d4-44c5-bea0-fec0eec934a5","Type":"ContainerDied","Data":"e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e"} Jan 30 05:34:16 crc kubenswrapper[4931]: I0130 05:34:16.021681 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjwsc" event={"ID":"b37cadec-51d4-44c5-bea0-fec0eec934a5","Type":"ContainerStarted","Data":"21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a"} Jan 30 05:34:16 crc kubenswrapper[4931]: I0130 05:34:16.057559 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hjwsc" podStartSLOduration=2.593710662 podStartE2EDuration="4.057536932s" podCreationTimestamp="2026-01-30 05:34:12 +0000 UTC" firstStartedPulling="2026-01-30 05:34:13.979573375 +0000 UTC m=+1589.349483672" lastFinishedPulling="2026-01-30 05:34:15.443399645 +0000 UTC m=+1590.813309942" observedRunningTime="2026-01-30 05:34:16.054162358 +0000 UTC m=+1591.424072645" watchObservedRunningTime="2026-01-30 05:34:16.057536932 +0000 UTC m=+1591.427447209" Jan 30 05:34:22 crc kubenswrapper[4931]: I0130 05:34:22.564829 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:22 crc kubenswrapper[4931]: I0130 
05:34:22.565526 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:22 crc kubenswrapper[4931]: I0130 05:34:22.640030 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:23 crc kubenswrapper[4931]: I0130 05:34:23.159352 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:23 crc kubenswrapper[4931]: I0130 05:34:23.233741 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjwsc"] Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.100287 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hjwsc" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="registry-server" containerID="cri-o://21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a" gracePeriod=2 Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.568291 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.667764 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-catalog-content\") pod \"b37cadec-51d4-44c5-bea0-fec0eec934a5\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.667824 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smnf4\" (UniqueName: \"kubernetes.io/projected/b37cadec-51d4-44c5-bea0-fec0eec934a5-kube-api-access-smnf4\") pod \"b37cadec-51d4-44c5-bea0-fec0eec934a5\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.667935 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-utilities\") pod \"b37cadec-51d4-44c5-bea0-fec0eec934a5\" (UID: \"b37cadec-51d4-44c5-bea0-fec0eec934a5\") " Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.679107 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-utilities" (OuterVolumeSpecName: "utilities") pod "b37cadec-51d4-44c5-bea0-fec0eec934a5" (UID: "b37cadec-51d4-44c5-bea0-fec0eec934a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.704630 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37cadec-51d4-44c5-bea0-fec0eec934a5-kube-api-access-smnf4" (OuterVolumeSpecName: "kube-api-access-smnf4") pod "b37cadec-51d4-44c5-bea0-fec0eec934a5" (UID: "b37cadec-51d4-44c5-bea0-fec0eec934a5"). InnerVolumeSpecName "kube-api-access-smnf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.712060 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b37cadec-51d4-44c5-bea0-fec0eec934a5" (UID: "b37cadec-51d4-44c5-bea0-fec0eec934a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.773962 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.774028 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smnf4\" (UniqueName: \"kubernetes.io/projected/b37cadec-51d4-44c5-bea0-fec0eec934a5-kube-api-access-smnf4\") on node \"crc\" DevicePath \"\"" Jan 30 05:34:25 crc kubenswrapper[4931]: I0130 05:34:25.774058 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b37cadec-51d4-44c5-bea0-fec0eec934a5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.115352 4931 generic.go:334] "Generic (PLEG): container finished" podID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerID="21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a" exitCode=0 Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.115414 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hjwsc" event={"ID":"b37cadec-51d4-44c5-bea0-fec0eec934a5","Type":"ContainerDied","Data":"21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a"} Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.115485 4931 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-hjwsc" event={"ID":"b37cadec-51d4-44c5-bea0-fec0eec934a5","Type":"ContainerDied","Data":"baa1bec2ab7247b413a344955d2ac6f2777451917fd1e7ba7fd3c0f693e3f21b"} Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.115520 4931 scope.go:117] "RemoveContainer" containerID="21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.115672 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hjwsc" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.149670 4931 scope.go:117] "RemoveContainer" containerID="e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.177751 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjwsc"] Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.193090 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hjwsc"] Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.196476 4931 scope.go:117] "RemoveContainer" containerID="91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.228682 4931 scope.go:117] "RemoveContainer" containerID="21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a" Jan 30 05:34:26 crc kubenswrapper[4931]: E0130 05:34:26.233261 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a\": container with ID starting with 21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a not found: ID does not exist" containerID="21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.233345 4931 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a"} err="failed to get container status \"21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a\": rpc error: code = NotFound desc = could not find container \"21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a\": container with ID starting with 21eb531868f45fbe6cad91e5b677c4767f86767d02e4c81b9388507e75b39c8a not found: ID does not exist" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.233396 4931 scope.go:117] "RemoveContainer" containerID="e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e" Jan 30 05:34:26 crc kubenswrapper[4931]: E0130 05:34:26.234004 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e\": container with ID starting with e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e not found: ID does not exist" containerID="e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.234049 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e"} err="failed to get container status \"e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e\": rpc error: code = NotFound desc = could not find container \"e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e\": container with ID starting with e2a32dfc112f9b116108a13a5d3f36ea9b3d3bc48e7e7684e30ebf5e91a98e3e not found: ID does not exist" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.234077 4931 scope.go:117] "RemoveContainer" containerID="91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2" Jan 30 05:34:26 crc kubenswrapper[4931]: E0130 
05:34:26.234692 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2\": container with ID starting with 91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2 not found: ID does not exist" containerID="91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2" Jan 30 05:34:26 crc kubenswrapper[4931]: I0130 05:34:26.234761 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2"} err="failed to get container status \"91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2\": rpc error: code = NotFound desc = could not find container \"91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2\": container with ID starting with 91ca3a00bf52ac983475cb121e8ea3da1cb7835cb041680436c65b90998554f2 not found: ID does not exist" Jan 30 05:34:27 crc kubenswrapper[4931]: I0130 05:34:27.422914 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:34:27 crc kubenswrapper[4931]: E0130 05:34:27.423294 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:34:27 crc kubenswrapper[4931]: I0130 05:34:27.439704 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" path="/var/lib/kubelet/pods/b37cadec-51d4-44c5-bea0-fec0eec934a5/volumes" Jan 30 05:34:39 crc kubenswrapper[4931]: I0130 05:34:39.421468 
4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:34:39 crc kubenswrapper[4931]: E0130 05:34:39.422355 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:34:50 crc kubenswrapper[4931]: I0130 05:34:50.422791 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:34:50 crc kubenswrapper[4931]: E0130 05:34:50.424065 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.304017 4931 scope.go:117] "RemoveContainer" containerID="056aa11a16b72fe7fde4370093154af79d24b07c3142cb8943c78be2016d3fc6" Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.350553 4931 scope.go:117] "RemoveContainer" containerID="346e5462e41c54c8f5c2422490f080d2b64f85c405ea5cc5337aa66fee775153" Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.427592 4931 scope.go:117] "RemoveContainer" containerID="a729151ede12640ea81c41f5d7f2d36efd861e7a4d31b991fe42dd4d2139fbe2" Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.458521 4931 scope.go:117] "RemoveContainer" containerID="0f6848e1ccd25c33da13cfce62f451555f794b35623ff124d320281a39cb9911" 
Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.486838 4931 scope.go:117] "RemoveContainer" containerID="a32ca29963fb38a6014b8500b2c2495801c36c3f1563f62dcc7d71405aa5c328" Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.547074 4931 scope.go:117] "RemoveContainer" containerID="62da5f526098b3b9f5437a81119156f87878963a1c44c026236d9b63e20bbac5" Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.574939 4931 scope.go:117] "RemoveContainer" containerID="9186f065f875b33ba8c3817977c815fff7b67dc7404ddb06ea6a85fb64800755" Jan 30 05:34:51 crc kubenswrapper[4931]: I0130 05:34:51.604697 4931 scope.go:117] "RemoveContainer" containerID="6fe17572613dbd341b30bd762c1b5735b49c432308fe3f9a9ede6d5185282afe" Jan 30 05:35:03 crc kubenswrapper[4931]: I0130 05:35:03.422878 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:35:03 crc kubenswrapper[4931]: E0130 05:35:03.424107 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:35:14 crc kubenswrapper[4931]: I0130 05:35:14.422756 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:35:14 crc kubenswrapper[4931]: E0130 05:35:14.423827 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:35:25 crc kubenswrapper[4931]: I0130 05:35:25.428948 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:35:25 crc kubenswrapper[4931]: E0130 05:35:25.430069 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:35:36 crc kubenswrapper[4931]: I0130 05:35:36.422754 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:35:36 crc kubenswrapper[4931]: E0130 05:35:36.423878 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:35:47 crc kubenswrapper[4931]: I0130 05:35:47.422779 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:35:47 crc kubenswrapper[4931]: E0130 05:35:47.424029 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:35:51 crc kubenswrapper[4931]: I0130 05:35:51.768048 4931 scope.go:117] "RemoveContainer" containerID="25cc12087ab98d0fc79e679c4de5be61f557329f293a5a68393ba8b20a57c428" Jan 30 05:35:59 crc kubenswrapper[4931]: I0130 05:35:59.423059 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:35:59 crc kubenswrapper[4931]: E0130 05:35:59.423776 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:36:14 crc kubenswrapper[4931]: I0130 05:36:14.422657 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:36:14 crc kubenswrapper[4931]: E0130 05:36:14.423931 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:36:29 crc kubenswrapper[4931]: I0130 05:36:29.422393 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:36:29 crc kubenswrapper[4931]: E0130 05:36:29.423485 4931 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:36:43 crc kubenswrapper[4931]: I0130 05:36:43.423494 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:36:43 crc kubenswrapper[4931]: E0130 05:36:43.424576 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:36:56 crc kubenswrapper[4931]: I0130 05:36:56.421831 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:36:56 crc kubenswrapper[4931]: E0130 05:36:56.422881 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:37:11 crc kubenswrapper[4931]: I0130 05:37:11.422929 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:37:11 crc kubenswrapper[4931]: E0130 
05:37:11.424138 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:37:24 crc kubenswrapper[4931]: I0130 05:37:24.422323 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:37:24 crc kubenswrapper[4931]: E0130 05:37:24.422943 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:37:38 crc kubenswrapper[4931]: I0130 05:37:38.422091 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:37:38 crc kubenswrapper[4931]: E0130 05:37:38.423135 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:37:51 crc kubenswrapper[4931]: I0130 05:37:51.422060 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:37:51 crc 
kubenswrapper[4931]: E0130 05:37:51.423136 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:38:04 crc kubenswrapper[4931]: I0130 05:38:04.422113 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:38:05 crc kubenswrapper[4931]: I0130 05:38:05.265683 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"b0bb1a2bae55022b0965d85460d3f0d10d63b5551af2e94575d14b0dc028f44b"} Jan 30 05:39:51 crc kubenswrapper[4931]: I0130 05:39:51.888405 4931 scope.go:117] "RemoveContainer" containerID="ae533e994fe609ff12ac93cc07298624016e31c68088cbe01149f6798c73cc4c" Jan 30 05:39:51 crc kubenswrapper[4931]: I0130 05:39:51.934200 4931 scope.go:117] "RemoveContainer" containerID="d1c237bd63c74b3077c6a07d297620056ee6292b8cbe29e75c0b1d8fe17f2d15" Jan 30 05:39:51 crc kubenswrapper[4931]: I0130 05:39:51.968253 4931 scope.go:117] "RemoveContainer" containerID="ac8579f185e9d5a31fb735e240a7db9474963e4a3fc8a2610621483b60d98f53" Jan 30 05:40:27 crc kubenswrapper[4931]: I0130 05:40:27.363768 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:40:27 crc kubenswrapper[4931]: I0130 05:40:27.364282 4931 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:40:57 crc kubenswrapper[4931]: I0130 05:40:57.363050 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:40:57 crc kubenswrapper[4931]: I0130 05:40:57.363676 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:41:27 crc kubenswrapper[4931]: I0130 05:41:27.362922 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:41:27 crc kubenswrapper[4931]: I0130 05:41:27.363690 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:41:27 crc kubenswrapper[4931]: I0130 05:41:27.363764 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:41:27 crc 
kubenswrapper[4931]: I0130 05:41:27.364692 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0bb1a2bae55022b0965d85460d3f0d10d63b5551af2e94575d14b0dc028f44b"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:41:27 crc kubenswrapper[4931]: I0130 05:41:27.364820 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://b0bb1a2bae55022b0965d85460d3f0d10d63b5551af2e94575d14b0dc028f44b" gracePeriod=600 Jan 30 05:41:28 crc kubenswrapper[4931]: I0130 05:41:28.190256 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="b0bb1a2bae55022b0965d85460d3f0d10d63b5551af2e94575d14b0dc028f44b" exitCode=0 Jan 30 05:41:28 crc kubenswrapper[4931]: I0130 05:41:28.190328 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"b0bb1a2bae55022b0965d85460d3f0d10d63b5551af2e94575d14b0dc028f44b"} Jan 30 05:41:28 crc kubenswrapper[4931]: I0130 05:41:28.190679 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4"} Jan 30 05:41:28 crc kubenswrapper[4931]: I0130 05:41:28.190713 4931 scope.go:117] "RemoveContainer" containerID="245db06188d68a7c7dde2f80f6002515bda5e496caa936666637ab4b8b52d5a0" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.272293 4931 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7c2mf"] Jan 30 05:42:58 crc kubenswrapper[4931]: E0130 05:42:58.273287 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="extract-utilities" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.273308 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="extract-utilities" Jan 30 05:42:58 crc kubenswrapper[4931]: E0130 05:42:58.273341 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="registry-server" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.273357 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="registry-server" Jan 30 05:42:58 crc kubenswrapper[4931]: E0130 05:42:58.273380 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="extract-content" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.273393 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="extract-content" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.273673 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37cadec-51d4-44c5-bea0-fec0eec934a5" containerName="registry-server" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.275460 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.290347 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7c2mf"] Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.382880 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh9fv\" (UniqueName: \"kubernetes.io/projected/857de757-e591-4d21-8c09-df06fe672113-kube-api-access-rh9fv\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.383011 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-catalog-content\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.383068 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-utilities\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.484295 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh9fv\" (UniqueName: \"kubernetes.io/projected/857de757-e591-4d21-8c09-df06fe672113-kube-api-access-rh9fv\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.484450 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-catalog-content\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.484508 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-utilities\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.485331 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-catalog-content\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.485400 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-utilities\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.525482 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh9fv\" (UniqueName: \"kubernetes.io/projected/857de757-e591-4d21-8c09-df06fe672113-kube-api-access-rh9fv\") pod \"community-operators-7c2mf\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:58 crc kubenswrapper[4931]: I0130 05:42:58.608080 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:42:59 crc kubenswrapper[4931]: I0130 05:42:59.065783 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7c2mf"] Jan 30 05:42:59 crc kubenswrapper[4931]: I0130 05:42:59.324857 4931 generic.go:334] "Generic (PLEG): container finished" podID="857de757-e591-4d21-8c09-df06fe672113" containerID="f6a175af3e413d326c864a9a3c6f8a72ada9a5572ccc2a95480d4ece4789f1a7" exitCode=0 Jan 30 05:42:59 crc kubenswrapper[4931]: I0130 05:42:59.324923 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7c2mf" event={"ID":"857de757-e591-4d21-8c09-df06fe672113","Type":"ContainerDied","Data":"f6a175af3e413d326c864a9a3c6f8a72ada9a5572ccc2a95480d4ece4789f1a7"} Jan 30 05:42:59 crc kubenswrapper[4931]: I0130 05:42:59.324960 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7c2mf" event={"ID":"857de757-e591-4d21-8c09-df06fe672113","Type":"ContainerStarted","Data":"0fd0f29f39dfc2f8fc36c20641bb3dc4341ba8d334f88c8b701b78ba022b2b94"} Jan 30 05:42:59 crc kubenswrapper[4931]: I0130 05:42:59.327564 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:43:00 crc kubenswrapper[4931]: I0130 05:43:00.335106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7c2mf" event={"ID":"857de757-e591-4d21-8c09-df06fe672113","Type":"ContainerStarted","Data":"b672890edb88cc835c7a676cc899fe42c12ea91373683fb907536ec1d8955df4"} Jan 30 05:43:01 crc kubenswrapper[4931]: I0130 05:43:01.346526 4931 generic.go:334] "Generic (PLEG): container finished" podID="857de757-e591-4d21-8c09-df06fe672113" containerID="b672890edb88cc835c7a676cc899fe42c12ea91373683fb907536ec1d8955df4" exitCode=0 Jan 30 05:43:01 crc kubenswrapper[4931]: I0130 05:43:01.346615 4931 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-7c2mf" event={"ID":"857de757-e591-4d21-8c09-df06fe672113","Type":"ContainerDied","Data":"b672890edb88cc835c7a676cc899fe42c12ea91373683fb907536ec1d8955df4"} Jan 30 05:43:02 crc kubenswrapper[4931]: I0130 05:43:02.357444 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7c2mf" event={"ID":"857de757-e591-4d21-8c09-df06fe672113","Type":"ContainerStarted","Data":"e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6"} Jan 30 05:43:02 crc kubenswrapper[4931]: I0130 05:43:02.382030 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7c2mf" podStartSLOduration=1.974811441 podStartE2EDuration="4.382008155s" podCreationTimestamp="2026-01-30 05:42:58 +0000 UTC" firstStartedPulling="2026-01-30 05:42:59.327188684 +0000 UTC m=+2114.697098971" lastFinishedPulling="2026-01-30 05:43:01.734385418 +0000 UTC m=+2117.104295685" observedRunningTime="2026-01-30 05:43:02.375075085 +0000 UTC m=+2117.744985362" watchObservedRunningTime="2026-01-30 05:43:02.382008155 +0000 UTC m=+2117.751918422" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.608510 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.609289 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.647026 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pdkc7"] Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.649526 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.664229 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pdkc7"] Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.695069 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.730256 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ktl6\" (UniqueName: \"kubernetes.io/projected/e047952b-acbc-4cc4-b175-8a23b1926766-kube-api-access-6ktl6\") pod \"certified-operators-pdkc7\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.730742 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-utilities\") pod \"certified-operators-pdkc7\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.730913 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-catalog-content\") pod \"certified-operators-pdkc7\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.832053 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-catalog-content\") pod \"certified-operators-pdkc7\" (UID: 
\"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.832124 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktl6\" (UniqueName: \"kubernetes.io/projected/e047952b-acbc-4cc4-b175-8a23b1926766-kube-api-access-6ktl6\") pod \"certified-operators-pdkc7\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.832203 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-utilities\") pod \"certified-operators-pdkc7\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.832777 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-utilities\") pod \"certified-operators-pdkc7\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.832776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-catalog-content\") pod \"certified-operators-pdkc7\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.861930 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ktl6\" (UniqueName: \"kubernetes.io/projected/e047952b-acbc-4cc4-b175-8a23b1926766-kube-api-access-6ktl6\") pod \"certified-operators-pdkc7\" (UID: 
\"e047952b-acbc-4cc4-b175-8a23b1926766\") " pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:08 crc kubenswrapper[4931]: I0130 05:43:08.974526 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:09 crc kubenswrapper[4931]: I0130 05:43:09.493839 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pdkc7"] Jan 30 05:43:09 crc kubenswrapper[4931]: I0130 05:43:09.521202 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:43:10 crc kubenswrapper[4931]: I0130 05:43:10.434504 4931 generic.go:334] "Generic (PLEG): container finished" podID="e047952b-acbc-4cc4-b175-8a23b1926766" containerID="c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb" exitCode=0 Jan 30 05:43:10 crc kubenswrapper[4931]: I0130 05:43:10.435891 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdkc7" event={"ID":"e047952b-acbc-4cc4-b175-8a23b1926766","Type":"ContainerDied","Data":"c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb"} Jan 30 05:43:10 crc kubenswrapper[4931]: I0130 05:43:10.435915 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdkc7" event={"ID":"e047952b-acbc-4cc4-b175-8a23b1926766","Type":"ContainerStarted","Data":"43370026d1dd1ea6668c33c998acaf57c52537d0f5113eb38c1504292cd9a450"} Jan 30 05:43:11 crc kubenswrapper[4931]: I0130 05:43:11.447532 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdkc7" event={"ID":"e047952b-acbc-4cc4-b175-8a23b1926766","Type":"ContainerStarted","Data":"0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f"} Jan 30 05:43:12 crc kubenswrapper[4931]: I0130 05:43:12.456152 4931 generic.go:334] "Generic (PLEG): container 
finished" podID="e047952b-acbc-4cc4-b175-8a23b1926766" containerID="0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f" exitCode=0 Jan 30 05:43:12 crc kubenswrapper[4931]: I0130 05:43:12.456225 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdkc7" event={"ID":"e047952b-acbc-4cc4-b175-8a23b1926766","Type":"ContainerDied","Data":"0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f"} Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.449263 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g84bs"] Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.450876 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.466272 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdkc7" event={"ID":"e047952b-acbc-4cc4-b175-8a23b1926766","Type":"ContainerStarted","Data":"d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44"} Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.493381 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pdkc7" podStartSLOduration=3.078526799 podStartE2EDuration="5.49335773s" podCreationTimestamp="2026-01-30 05:43:08 +0000 UTC" firstStartedPulling="2026-01-30 05:43:10.436674949 +0000 UTC m=+2125.806585196" lastFinishedPulling="2026-01-30 05:43:12.85150588 +0000 UTC m=+2128.221416127" observedRunningTime="2026-01-30 05:43:13.492672411 +0000 UTC m=+2128.862582708" watchObservedRunningTime="2026-01-30 05:43:13.49335773 +0000 UTC m=+2128.863268017" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.509861 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-utilities\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.510071 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br68t\" (UniqueName: \"kubernetes.io/projected/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-kube-api-access-br68t\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.510317 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-catalog-content\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.511765 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g84bs"] Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.611867 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br68t\" (UniqueName: \"kubernetes.io/projected/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-kube-api-access-br68t\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.611983 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-catalog-content\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " 
pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.612036 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-utilities\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.612593 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-catalog-content\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.612693 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-utilities\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.635664 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br68t\" (UniqueName: \"kubernetes.io/projected/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-kube-api-access-br68t\") pod \"redhat-operators-g84bs\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:13 crc kubenswrapper[4931]: I0130 05:43:13.774182 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:14 crc kubenswrapper[4931]: I0130 05:43:14.217581 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g84bs"] Jan 30 05:43:14 crc kubenswrapper[4931]: W0130 05:43:14.220844 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7f649a_ec93_4f68_a9c4_a3f979bd4394.slice/crio-342b5a1e36ce4641c2defe1d001ddf0e69d00817239c0204b0c40243055ce4f6 WatchSource:0}: Error finding container 342b5a1e36ce4641c2defe1d001ddf0e69d00817239c0204b0c40243055ce4f6: Status 404 returned error can't find the container with id 342b5a1e36ce4641c2defe1d001ddf0e69d00817239c0204b0c40243055ce4f6 Jan 30 05:43:14 crc kubenswrapper[4931]: I0130 05:43:14.473222 4931 generic.go:334] "Generic (PLEG): container finished" podID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerID="afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23" exitCode=0 Jan 30 05:43:14 crc kubenswrapper[4931]: I0130 05:43:14.473340 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g84bs" event={"ID":"4d7f649a-ec93-4f68-a9c4-a3f979bd4394","Type":"ContainerDied","Data":"afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23"} Jan 30 05:43:14 crc kubenswrapper[4931]: I0130 05:43:14.473364 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g84bs" event={"ID":"4d7f649a-ec93-4f68-a9c4-a3f979bd4394","Type":"ContainerStarted","Data":"342b5a1e36ce4641c2defe1d001ddf0e69d00817239c0204b0c40243055ce4f6"} Jan 30 05:43:15 crc kubenswrapper[4931]: I0130 05:43:15.499625 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g84bs" 
event={"ID":"4d7f649a-ec93-4f68-a9c4-a3f979bd4394","Type":"ContainerStarted","Data":"02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a"} Jan 30 05:43:16 crc kubenswrapper[4931]: I0130 05:43:16.510130 4931 generic.go:334] "Generic (PLEG): container finished" podID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerID="02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a" exitCode=0 Jan 30 05:43:16 crc kubenswrapper[4931]: I0130 05:43:16.510176 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g84bs" event={"ID":"4d7f649a-ec93-4f68-a9c4-a3f979bd4394","Type":"ContainerDied","Data":"02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a"} Jan 30 05:43:17 crc kubenswrapper[4931]: I0130 05:43:17.519021 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g84bs" event={"ID":"4d7f649a-ec93-4f68-a9c4-a3f979bd4394","Type":"ContainerStarted","Data":"f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1"} Jan 30 05:43:17 crc kubenswrapper[4931]: I0130 05:43:17.542494 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g84bs" podStartSLOduration=2.042985569 podStartE2EDuration="4.542468068s" podCreationTimestamp="2026-01-30 05:43:13 +0000 UTC" firstStartedPulling="2026-01-30 05:43:14.474620851 +0000 UTC m=+2129.844531108" lastFinishedPulling="2026-01-30 05:43:16.97410335 +0000 UTC m=+2132.344013607" observedRunningTime="2026-01-30 05:43:17.53996217 +0000 UTC m=+2132.909872437" watchObservedRunningTime="2026-01-30 05:43:17.542468068 +0000 UTC m=+2132.912378345" Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.238160 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7c2mf"] Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.238841 4931 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-7c2mf" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="registry-server" containerID="cri-o://e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6" gracePeriod=2 Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.565102 4931 generic.go:334] "Generic (PLEG): container finished" podID="857de757-e591-4d21-8c09-df06fe672113" containerID="e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6" exitCode=0 Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.567302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7c2mf" event={"ID":"857de757-e591-4d21-8c09-df06fe672113","Type":"ContainerDied","Data":"e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6"} Jan 30 05:43:18 crc kubenswrapper[4931]: E0130 05:43:18.612791 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6 is running failed: container process not found" containerID="e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:43:18 crc kubenswrapper[4931]: E0130 05:43:18.620088 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6 is running failed: container process not found" containerID="e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:43:18 crc kubenswrapper[4931]: E0130 05:43:18.623629 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6 is 
running failed: container process not found" containerID="e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:43:18 crc kubenswrapper[4931]: E0130 05:43:18.623661 4931 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-7c2mf" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="registry-server" Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.773844 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.912210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-utilities\") pod \"857de757-e591-4d21-8c09-df06fe672113\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.912316 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-catalog-content\") pod \"857de757-e591-4d21-8c09-df06fe672113\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.912509 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh9fv\" (UniqueName: \"kubernetes.io/projected/857de757-e591-4d21-8c09-df06fe672113-kube-api-access-rh9fv\") pod \"857de757-e591-4d21-8c09-df06fe672113\" (UID: \"857de757-e591-4d21-8c09-df06fe672113\") " Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.915779 4931 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-utilities" (OuterVolumeSpecName: "utilities") pod "857de757-e591-4d21-8c09-df06fe672113" (UID: "857de757-e591-4d21-8c09-df06fe672113"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.920712 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857de757-e591-4d21-8c09-df06fe672113-kube-api-access-rh9fv" (OuterVolumeSpecName: "kube-api-access-rh9fv") pod "857de757-e591-4d21-8c09-df06fe672113" (UID: "857de757-e591-4d21-8c09-df06fe672113"). InnerVolumeSpecName "kube-api-access-rh9fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.975524 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.976349 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:18 crc kubenswrapper[4931]: I0130 05:43:18.988156 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "857de757-e591-4d21-8c09-df06fe672113" (UID: "857de757-e591-4d21-8c09-df06fe672113"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.014359 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh9fv\" (UniqueName: \"kubernetes.io/projected/857de757-e591-4d21-8c09-df06fe672113-kube-api-access-rh9fv\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.014391 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.014403 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857de757-e591-4d21-8c09-df06fe672113-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.032236 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.584499 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7c2mf" event={"ID":"857de757-e591-4d21-8c09-df06fe672113","Type":"ContainerDied","Data":"0fd0f29f39dfc2f8fc36c20641bb3dc4341ba8d334f88c8b701b78ba022b2b94"} Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.584568 4931 scope.go:117] "RemoveContainer" containerID="e12e6d3ec4535e7b05b1464e713c13987b8f0fa47679ea5aab67011140b54df6" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.584562 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7c2mf" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.609970 4931 scope.go:117] "RemoveContainer" containerID="b672890edb88cc835c7a676cc899fe42c12ea91373683fb907536ec1d8955df4" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.616021 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7c2mf"] Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.624507 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7c2mf"] Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.633610 4931 scope.go:117] "RemoveContainer" containerID="f6a175af3e413d326c864a9a3c6f8a72ada9a5572ccc2a95480d4ece4789f1a7" Jan 30 05:43:19 crc kubenswrapper[4931]: I0130 05:43:19.648084 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:21 crc kubenswrapper[4931]: I0130 05:43:21.436710 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="857de757-e591-4d21-8c09-df06fe672113" path="/var/lib/kubelet/pods/857de757-e591-4d21-8c09-df06fe672113/volumes" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.032608 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pdkc7"] Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.033662 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pdkc7" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="registry-server" containerID="cri-o://d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44" gracePeriod=2 Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.505400 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.586957 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-utilities\") pod \"e047952b-acbc-4cc4-b175-8a23b1926766\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.587039 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-catalog-content\") pod \"e047952b-acbc-4cc4-b175-8a23b1926766\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.587068 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ktl6\" (UniqueName: \"kubernetes.io/projected/e047952b-acbc-4cc4-b175-8a23b1926766-kube-api-access-6ktl6\") pod \"e047952b-acbc-4cc4-b175-8a23b1926766\" (UID: \"e047952b-acbc-4cc4-b175-8a23b1926766\") " Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.588154 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-utilities" (OuterVolumeSpecName: "utilities") pod "e047952b-acbc-4cc4-b175-8a23b1926766" (UID: "e047952b-acbc-4cc4-b175-8a23b1926766"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.592842 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e047952b-acbc-4cc4-b175-8a23b1926766-kube-api-access-6ktl6" (OuterVolumeSpecName: "kube-api-access-6ktl6") pod "e047952b-acbc-4cc4-b175-8a23b1926766" (UID: "e047952b-acbc-4cc4-b175-8a23b1926766"). InnerVolumeSpecName "kube-api-access-6ktl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.614497 4931 generic.go:334] "Generic (PLEG): container finished" podID="e047952b-acbc-4cc4-b175-8a23b1926766" containerID="d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44" exitCode=0 Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.614541 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdkc7" event={"ID":"e047952b-acbc-4cc4-b175-8a23b1926766","Type":"ContainerDied","Data":"d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44"} Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.614555 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pdkc7" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.614577 4931 scope.go:117] "RemoveContainer" containerID="d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.614567 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdkc7" event={"ID":"e047952b-acbc-4cc4-b175-8a23b1926766","Type":"ContainerDied","Data":"43370026d1dd1ea6668c33c998acaf57c52537d0f5113eb38c1504292cd9a450"} Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.640905 4931 scope.go:117] "RemoveContainer" containerID="0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.652071 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e047952b-acbc-4cc4-b175-8a23b1926766" (UID: "e047952b-acbc-4cc4-b175-8a23b1926766"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.660841 4931 scope.go:117] "RemoveContainer" containerID="c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.689578 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.689659 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e047952b-acbc-4cc4-b175-8a23b1926766-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.689688 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ktl6\" (UniqueName: \"kubernetes.io/projected/e047952b-acbc-4cc4-b175-8a23b1926766-kube-api-access-6ktl6\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.694352 4931 scope.go:117] "RemoveContainer" containerID="d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44" Jan 30 05:43:23 crc kubenswrapper[4931]: E0130 05:43:23.695014 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44\": container with ID starting with d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44 not found: ID does not exist" containerID="d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.695107 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44"} err="failed to get container status 
\"d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44\": rpc error: code = NotFound desc = could not find container \"d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44\": container with ID starting with d05f22ac35dab00e228e135d0c3cbe5eb830af18c1f8f76ac91f8c4d34517c44 not found: ID does not exist" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.695180 4931 scope.go:117] "RemoveContainer" containerID="0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f" Jan 30 05:43:23 crc kubenswrapper[4931]: E0130 05:43:23.695797 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f\": container with ID starting with 0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f not found: ID does not exist" containerID="0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.695858 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f"} err="failed to get container status \"0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f\": rpc error: code = NotFound desc = could not find container \"0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f\": container with ID starting with 0429a78ef978a72eeb5b6f269db910d667fae0bc20b8052dab294c709b1d712f not found: ID does not exist" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.695899 4931 scope.go:117] "RemoveContainer" containerID="c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb" Jan 30 05:43:23 crc kubenswrapper[4931]: E0130 05:43:23.696459 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb\": container with ID starting with c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb not found: ID does not exist" containerID="c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.696508 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb"} err="failed to get container status \"c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb\": rpc error: code = NotFound desc = could not find container \"c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb\": container with ID starting with c71840922cabbd330c9f37cc39b32a1a7643783adad5e2815015183517a80dcb not found: ID does not exist" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.775090 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.775166 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.952302 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pdkc7"] Jan 30 05:43:23 crc kubenswrapper[4931]: I0130 05:43:23.964181 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pdkc7"] Jan 30 05:43:24 crc kubenswrapper[4931]: I0130 05:43:24.835607 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g84bs" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="registry-server" probeResult="failure" output=< Jan 30 05:43:24 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 05:43:24 crc 
kubenswrapper[4931]: > Jan 30 05:43:25 crc kubenswrapper[4931]: I0130 05:43:25.438765 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" path="/var/lib/kubelet/pods/e047952b-acbc-4cc4-b175-8a23b1926766/volumes" Jan 30 05:43:27 crc kubenswrapper[4931]: I0130 05:43:27.363144 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:43:27 crc kubenswrapper[4931]: I0130 05:43:27.363207 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:43:33 crc kubenswrapper[4931]: I0130 05:43:33.851008 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:33 crc kubenswrapper[4931]: I0130 05:43:33.920940 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:34 crc kubenswrapper[4931]: I0130 05:43:34.099538 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g84bs"] Jan 30 05:43:35 crc kubenswrapper[4931]: I0130 05:43:35.737009 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g84bs" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="registry-server" containerID="cri-o://f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1" gracePeriod=2 Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.259747 
4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g84bs" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.394829 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-utilities\") pod \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.394984 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br68t\" (UniqueName: \"kubernetes.io/projected/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-kube-api-access-br68t\") pod \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.395060 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-catalog-content\") pod \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\" (UID: \"4d7f649a-ec93-4f68-a9c4-a3f979bd4394\") " Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.396391 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-utilities" (OuterVolumeSpecName: "utilities") pod "4d7f649a-ec93-4f68-a9c4-a3f979bd4394" (UID: "4d7f649a-ec93-4f68-a9c4-a3f979bd4394"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.405360 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-kube-api-access-br68t" (OuterVolumeSpecName: "kube-api-access-br68t") pod "4d7f649a-ec93-4f68-a9c4-a3f979bd4394" (UID: "4d7f649a-ec93-4f68-a9c4-a3f979bd4394"). 
InnerVolumeSpecName "kube-api-access-br68t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.498463 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br68t\" (UniqueName: \"kubernetes.io/projected/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-kube-api-access-br68t\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.499731 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.585488 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d7f649a-ec93-4f68-a9c4-a3f979bd4394" (UID: "4d7f649a-ec93-4f68-a9c4-a3f979bd4394"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.601412 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7f649a-ec93-4f68-a9c4-a3f979bd4394-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.747954 4931 generic.go:334] "Generic (PLEG): container finished" podID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerID="f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1" exitCode=0 Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.747996 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g84bs" event={"ID":"4d7f649a-ec93-4f68-a9c4-a3f979bd4394","Type":"ContainerDied","Data":"f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1"} Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.748022 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g84bs" event={"ID":"4d7f649a-ec93-4f68-a9c4-a3f979bd4394","Type":"ContainerDied","Data":"342b5a1e36ce4641c2defe1d001ddf0e69d00817239c0204b0c40243055ce4f6"} Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.748040 4931 scope.go:117] "RemoveContainer" containerID="f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1" Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.748068 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g84bs"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.780202 4931 scope.go:117] "RemoveContainer" containerID="02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.816015 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g84bs"]
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.828909 4931 scope.go:117] "RemoveContainer" containerID="afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.830373 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g84bs"]
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.855983 4931 scope.go:117] "RemoveContainer" containerID="f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1"
Jan 30 05:43:36 crc kubenswrapper[4931]: E0130 05:43:36.856684 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1\": container with ID starting with f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1 not found: ID does not exist" containerID="f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.856754 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1"} err="failed to get container status \"f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1\": rpc error: code = NotFound desc = could not find container \"f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1\": container with ID starting with f081a7f06933819e7bd133a095a975cc0b7391cf59d391c4be5b65593201afa1 not found: ID does not exist"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.856797 4931 scope.go:117] "RemoveContainer" containerID="02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a"
Jan 30 05:43:36 crc kubenswrapper[4931]: E0130 05:43:36.857517 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a\": container with ID starting with 02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a not found: ID does not exist" containerID="02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.857564 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a"} err="failed to get container status \"02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a\": rpc error: code = NotFound desc = could not find container \"02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a\": container with ID starting with 02a8105a0301b9a28a0523b19255e5044e5924cf2163dee95ec23a2ca278682a not found: ID does not exist"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.857632 4931 scope.go:117] "RemoveContainer" containerID="afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23"
Jan 30 05:43:36 crc kubenswrapper[4931]: E0130 05:43:36.858147 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23\": container with ID starting with afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23 not found: ID does not exist" containerID="afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23"
Jan 30 05:43:36 crc kubenswrapper[4931]: I0130 05:43:36.858201 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23"} err="failed to get container status \"afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23\": rpc error: code = NotFound desc = could not find container \"afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23\": container with ID starting with afb997e59210abc4d4f09acd8d97652f2288e9e4b55bf2612337d1c72bd2ad23 not found: ID does not exist"
Jan 30 05:43:37 crc kubenswrapper[4931]: I0130 05:43:37.439501 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" path="/var/lib/kubelet/pods/4d7f649a-ec93-4f68-a9c4-a3f979bd4394/volumes"
Jan 30 05:43:57 crc kubenswrapper[4931]: I0130 05:43:57.363413 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:43:57 crc kubenswrapper[4931]: I0130 05:43:57.364698 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:44:27 crc kubenswrapper[4931]: I0130 05:44:27.363618 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:44:27 crc kubenswrapper[4931]: I0130 05:44:27.364300 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:44:27 crc kubenswrapper[4931]: I0130 05:44:27.364368 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
Jan 30 05:44:27 crc kubenswrapper[4931]: I0130 05:44:27.365310 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:44:27 crc kubenswrapper[4931]: I0130 05:44:27.365471 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" gracePeriod=600
Jan 30 05:44:27 crc kubenswrapper[4931]: E0130 05:44:27.489906 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:44:28 crc kubenswrapper[4931]: I0130 05:44:28.254484 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" exitCode=0
Jan 30 05:44:28 crc kubenswrapper[4931]: I0130 05:44:28.254532 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4"}
Jan 30 05:44:28 crc kubenswrapper[4931]: I0130 05:44:28.255520 4931 scope.go:117] "RemoveContainer" containerID="b0bb1a2bae55022b0965d85460d3f0d10d63b5551af2e94575d14b0dc028f44b"
Jan 30 05:44:28 crc kubenswrapper[4931]: I0130 05:44:28.256730 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4"
Jan 30 05:44:28 crc kubenswrapper[4931]: E0130 05:44:28.257333 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:44:39 crc kubenswrapper[4931]: I0130 05:44:39.422341 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4"
Jan 30 05:44:39 crc kubenswrapper[4931]: E0130 05:44:39.423615 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.992702 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwhpr"]
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993412 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="extract-utilities"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993458 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="extract-utilities"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993487 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993500 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993515 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993530 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993558 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="extract-utilities"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993572 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="extract-utilities"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993590 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="extract-content"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993602 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="extract-content"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993621 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="extract-content"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993635 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="extract-content"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993658 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="extract-content"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993670 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="extract-content"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993696 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="extract-utilities"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993709 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="extract-utilities"
Jan 30 05:44:41 crc kubenswrapper[4931]: E0130 05:44:41.993731 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993742 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993957 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e047952b-acbc-4cc4-b175-8a23b1926766" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.993990 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="857de757-e591-4d21-8c09-df06fe672113" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.994019 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7f649a-ec93-4f68-a9c4-a3f979bd4394" containerName="registry-server"
Jan 30 05:44:41 crc kubenswrapper[4931]: I0130 05:44:41.995657 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.014146 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwhpr"]
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.067722 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-catalog-content\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.067873 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-utilities\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.067928 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq9dg\" (UniqueName: \"kubernetes.io/projected/689df455-3e6e-462f-bb80-862257e72f80-kube-api-access-cq9dg\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.171254 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-utilities\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.169388 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-utilities\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.171519 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq9dg\" (UniqueName: \"kubernetes.io/projected/689df455-3e6e-462f-bb80-862257e72f80-kube-api-access-cq9dg\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.172493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-catalog-content\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.173017 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-catalog-content\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.207081 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq9dg\" (UniqueName: \"kubernetes.io/projected/689df455-3e6e-462f-bb80-862257e72f80-kube-api-access-cq9dg\") pod \"redhat-marketplace-kwhpr\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") " pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.320898 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:42 crc kubenswrapper[4931]: I0130 05:44:42.598824 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwhpr"]
Jan 30 05:44:43 crc kubenswrapper[4931]: I0130 05:44:43.408640 4931 generic.go:334] "Generic (PLEG): container finished" podID="689df455-3e6e-462f-bb80-862257e72f80" containerID="fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff" exitCode=0
Jan 30 05:44:43 crc kubenswrapper[4931]: I0130 05:44:43.410388 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwhpr" event={"ID":"689df455-3e6e-462f-bb80-862257e72f80","Type":"ContainerDied","Data":"fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff"}
Jan 30 05:44:43 crc kubenswrapper[4931]: I0130 05:44:43.410511 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwhpr" event={"ID":"689df455-3e6e-462f-bb80-862257e72f80","Type":"ContainerStarted","Data":"167a7f0ace8b81d5e35d65f735e8b0cbec9a1392bc6ad83cb44e5292d753d103"}
Jan 30 05:44:44 crc kubenswrapper[4931]: I0130 05:44:44.422598 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwhpr" event={"ID":"689df455-3e6e-462f-bb80-862257e72f80","Type":"ContainerStarted","Data":"51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a"}
Jan 30 05:44:45 crc kubenswrapper[4931]: I0130 05:44:45.439947 4931 generic.go:334] "Generic (PLEG): container finished" podID="689df455-3e6e-462f-bb80-862257e72f80" containerID="51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a" exitCode=0
Jan 30 05:44:45 crc kubenswrapper[4931]: I0130 05:44:45.440001 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwhpr" event={"ID":"689df455-3e6e-462f-bb80-862257e72f80","Type":"ContainerDied","Data":"51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a"}
Jan 30 05:44:46 crc kubenswrapper[4931]: I0130 05:44:46.455106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwhpr" event={"ID":"689df455-3e6e-462f-bb80-862257e72f80","Type":"ContainerStarted","Data":"f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6"}
Jan 30 05:44:46 crc kubenswrapper[4931]: I0130 05:44:46.488540 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwhpr" podStartSLOduration=2.812061339 podStartE2EDuration="5.488515281s" podCreationTimestamp="2026-01-30 05:44:41 +0000 UTC" firstStartedPulling="2026-01-30 05:44:43.410902333 +0000 UTC m=+2218.780812630" lastFinishedPulling="2026-01-30 05:44:46.087356275 +0000 UTC m=+2221.457266572" observedRunningTime="2026-01-30 05:44:46.486981829 +0000 UTC m=+2221.856892126" watchObservedRunningTime="2026-01-30 05:44:46.488515281 +0000 UTC m=+2221.858425568"
Jan 30 05:44:52 crc kubenswrapper[4931]: I0130 05:44:52.322512 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:52 crc kubenswrapper[4931]: I0130 05:44:52.322887 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:52 crc kubenswrapper[4931]: I0130 05:44:52.406944 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:52 crc kubenswrapper[4931]: I0130 05:44:52.581604 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:52 crc kubenswrapper[4931]: I0130 05:44:52.662007 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwhpr"]
Jan 30 05:44:54 crc kubenswrapper[4931]: I0130 05:44:54.423005 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4"
Jan 30 05:44:54 crc kubenswrapper[4931]: E0130 05:44:54.423505 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 05:44:54 crc kubenswrapper[4931]: I0130 05:44:54.520510 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kwhpr" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="registry-server" containerID="cri-o://f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6" gracePeriod=2
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.035307 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.104207 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-catalog-content\") pod \"689df455-3e6e-462f-bb80-862257e72f80\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") "
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.104301 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq9dg\" (UniqueName: \"kubernetes.io/projected/689df455-3e6e-462f-bb80-862257e72f80-kube-api-access-cq9dg\") pod \"689df455-3e6e-462f-bb80-862257e72f80\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") "
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.104377 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-utilities\") pod \"689df455-3e6e-462f-bb80-862257e72f80\" (UID: \"689df455-3e6e-462f-bb80-862257e72f80\") "
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.106035 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-utilities" (OuterVolumeSpecName: "utilities") pod "689df455-3e6e-462f-bb80-862257e72f80" (UID: "689df455-3e6e-462f-bb80-862257e72f80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.111295 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689df455-3e6e-462f-bb80-862257e72f80-kube-api-access-cq9dg" (OuterVolumeSpecName: "kube-api-access-cq9dg") pod "689df455-3e6e-462f-bb80-862257e72f80" (UID: "689df455-3e6e-462f-bb80-862257e72f80"). InnerVolumeSpecName "kube-api-access-cq9dg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.145570 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "689df455-3e6e-462f-bb80-862257e72f80" (UID: "689df455-3e6e-462f-bb80-862257e72f80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.205574 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.205632 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq9dg\" (UniqueName: \"kubernetes.io/projected/689df455-3e6e-462f-bb80-862257e72f80-kube-api-access-cq9dg\") on node \"crc\" DevicePath \"\""
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.205653 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/689df455-3e6e-462f-bb80-862257e72f80-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.535146 4931 generic.go:334] "Generic (PLEG): container finished" podID="689df455-3e6e-462f-bb80-862257e72f80" containerID="f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6" exitCode=0
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.535374 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwhpr" event={"ID":"689df455-3e6e-462f-bb80-862257e72f80","Type":"ContainerDied","Data":"f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6"}
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.535490 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwhpr"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.535705 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwhpr" event={"ID":"689df455-3e6e-462f-bb80-862257e72f80","Type":"ContainerDied","Data":"167a7f0ace8b81d5e35d65f735e8b0cbec9a1392bc6ad83cb44e5292d753d103"}
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.535749 4931 scope.go:117] "RemoveContainer" containerID="f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.568717 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwhpr"]
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.585041 4931 scope.go:117] "RemoveContainer" containerID="51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.594387 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwhpr"]
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.620920 4931 scope.go:117] "RemoveContainer" containerID="fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.649612 4931 scope.go:117] "RemoveContainer" containerID="f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6"
Jan 30 05:44:55 crc kubenswrapper[4931]: E0130 05:44:55.650274 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6\": container with ID starting with f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6 not found: ID does not exist" containerID="f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.650440 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6"} err="failed to get container status \"f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6\": rpc error: code = NotFound desc = could not find container \"f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6\": container with ID starting with f20934b49aab3b3104e1fd8262b01dfc8e53d096dd53c230a1c774602e8f67e6 not found: ID does not exist"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.650475 4931 scope.go:117] "RemoveContainer" containerID="51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a"
Jan 30 05:44:55 crc kubenswrapper[4931]: E0130 05:44:55.651069 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a\": container with ID starting with 51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a not found: ID does not exist" containerID="51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.651109 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a"} err="failed to get container status \"51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a\": rpc error: code = NotFound desc = could not find container \"51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a\": container with ID starting with 51989f346674ebcf63e05a7e391b45094c69a90b5d75791219a49a5e5c5dea5a not found: ID does not exist"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.651141 4931 scope.go:117] "RemoveContainer" containerID="fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff"
Jan 30 05:44:55 crc kubenswrapper[4931]: E0130 05:44:55.651708 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff\": container with ID starting with fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff not found: ID does not exist" containerID="fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff"
Jan 30 05:44:55 crc kubenswrapper[4931]: I0130 05:44:55.651744 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff"} err="failed to get container status \"fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff\": rpc error: code = NotFound desc = could not find container \"fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff\": container with ID starting with fd23b52f69bb492366f239daeef8b41fa8300afb3a19cda8e307c939758a28ff not found: ID does not exist"
Jan 30 05:44:57 crc kubenswrapper[4931]: I0130 05:44:57.437612 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689df455-3e6e-462f-bb80-862257e72f80" path="/var/lib/kubelet/pods/689df455-3e6e-462f-bb80-862257e72f80/volumes"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.163059 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"]
Jan 30 05:45:00 crc kubenswrapper[4931]: E0130 05:45:00.164166 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="extract-utilities"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.164202 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="extract-utilities"
Jan 30 05:45:00 crc kubenswrapper[4931]: E0130 05:45:00.164237 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="registry-server"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.164249 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="registry-server"
Jan 30 05:45:00 crc kubenswrapper[4931]: E0130 05:45:00.164300 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="extract-content"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.164316 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="extract-content"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.164645 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="689df455-3e6e-462f-bb80-862257e72f80" containerName="registry-server"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.165621 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.168362 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.172665 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.175024 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"]
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.286702 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71ad7b66-28c5-436b-9dc4-86be3d48787b-secret-volume\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.286817 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hpwj\" (UniqueName: \"kubernetes.io/projected/71ad7b66-28c5-436b-9dc4-86be3d48787b-kube-api-access-7hpwj\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.286919 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71ad7b66-28c5-436b-9dc4-86be3d48787b-config-volume\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.387893 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hpwj\" (UniqueName: \"kubernetes.io/projected/71ad7b66-28c5-436b-9dc4-86be3d48787b-kube-api-access-7hpwj\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.388029 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71ad7b66-28c5-436b-9dc4-86be3d48787b-config-volume\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.388144 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71ad7b66-28c5-436b-9dc4-86be3d48787b-secret-volume\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.389540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71ad7b66-28c5-436b-9dc4-86be3d48787b-config-volume\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.400468 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71ad7b66-28c5-436b-9dc4-86be3d48787b-secret-volume\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.406738 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hpwj\" (UniqueName: \"kubernetes.io/projected/71ad7b66-28c5-436b-9dc4-86be3d48787b-kube-api-access-7hpwj\") pod \"collect-profiles-29495865-q2t6n\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"
Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.499842 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n" Jan 30 05:45:00 crc kubenswrapper[4931]: I0130 05:45:00.996270 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"] Jan 30 05:45:01 crc kubenswrapper[4931]: I0130 05:45:01.586818 4931 generic.go:334] "Generic (PLEG): container finished" podID="71ad7b66-28c5-436b-9dc4-86be3d48787b" containerID="0262628a4935b4dc10f986d98e7493ff62eab4841805fce6eb8783a9ef5f62e3" exitCode=0 Jan 30 05:45:01 crc kubenswrapper[4931]: I0130 05:45:01.586947 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n" event={"ID":"71ad7b66-28c5-436b-9dc4-86be3d48787b","Type":"ContainerDied","Data":"0262628a4935b4dc10f986d98e7493ff62eab4841805fce6eb8783a9ef5f62e3"} Jan 30 05:45:01 crc kubenswrapper[4931]: I0130 05:45:01.588842 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n" event={"ID":"71ad7b66-28c5-436b-9dc4-86be3d48787b","Type":"ContainerStarted","Data":"03eafa58de30069d7ac5ebf94066c455d3671932c3763fd86e4f19f69f3d9e4c"} Jan 30 05:45:02 crc kubenswrapper[4931]: I0130 05:45:02.966868 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.029596 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71ad7b66-28c5-436b-9dc4-86be3d48787b-secret-volume\") pod \"71ad7b66-28c5-436b-9dc4-86be3d48787b\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.029690 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71ad7b66-28c5-436b-9dc4-86be3d48787b-config-volume\") pod \"71ad7b66-28c5-436b-9dc4-86be3d48787b\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.029751 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hpwj\" (UniqueName: \"kubernetes.io/projected/71ad7b66-28c5-436b-9dc4-86be3d48787b-kube-api-access-7hpwj\") pod \"71ad7b66-28c5-436b-9dc4-86be3d48787b\" (UID: \"71ad7b66-28c5-436b-9dc4-86be3d48787b\") " Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.032900 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ad7b66-28c5-436b-9dc4-86be3d48787b-config-volume" (OuterVolumeSpecName: "config-volume") pod "71ad7b66-28c5-436b-9dc4-86be3d48787b" (UID: "71ad7b66-28c5-436b-9dc4-86be3d48787b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.037996 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ad7b66-28c5-436b-9dc4-86be3d48787b-kube-api-access-7hpwj" (OuterVolumeSpecName: "kube-api-access-7hpwj") pod "71ad7b66-28c5-436b-9dc4-86be3d48787b" (UID: "71ad7b66-28c5-436b-9dc4-86be3d48787b"). 
InnerVolumeSpecName "kube-api-access-7hpwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.039980 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ad7b66-28c5-436b-9dc4-86be3d48787b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "71ad7b66-28c5-436b-9dc4-86be3d48787b" (UID: "71ad7b66-28c5-436b-9dc4-86be3d48787b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.132098 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71ad7b66-28c5-436b-9dc4-86be3d48787b-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.132144 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71ad7b66-28c5-436b-9dc4-86be3d48787b-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.132164 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hpwj\" (UniqueName: \"kubernetes.io/projected/71ad7b66-28c5-436b-9dc4-86be3d48787b-kube-api-access-7hpwj\") on node \"crc\" DevicePath \"\"" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.608351 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n" event={"ID":"71ad7b66-28c5-436b-9dc4-86be3d48787b","Type":"ContainerDied","Data":"03eafa58de30069d7ac5ebf94066c455d3671932c3763fd86e4f19f69f3d9e4c"} Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.608710 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03eafa58de30069d7ac5ebf94066c455d3671932c3763fd86e4f19f69f3d9e4c" Jan 30 05:45:03 crc kubenswrapper[4931]: I0130 05:45:03.608491 4931 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n" Jan 30 05:45:04 crc kubenswrapper[4931]: I0130 05:45:04.087099 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"] Jan 30 05:45:04 crc kubenswrapper[4931]: I0130 05:45:04.096382 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-5cp8g"] Jan 30 05:45:05 crc kubenswrapper[4931]: I0130 05:45:05.442028 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:45:05 crc kubenswrapper[4931]: E0130 05:45:05.442579 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:45:05 crc kubenswrapper[4931]: I0130 05:45:05.460844 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8f99a6-f163-4720-8eb4-bc8607753d79" path="/var/lib/kubelet/pods/1a8f99a6-f163-4720-8eb4-bc8607753d79/volumes" Jan 30 05:45:18 crc kubenswrapper[4931]: I0130 05:45:18.421922 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:45:18 crc kubenswrapper[4931]: E0130 05:45:18.422873 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:45:29 crc kubenswrapper[4931]: I0130 05:45:29.423066 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:45:29 crc kubenswrapper[4931]: E0130 05:45:29.424362 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:45:41 crc kubenswrapper[4931]: I0130 05:45:41.422840 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:45:41 crc kubenswrapper[4931]: E0130 05:45:41.423912 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:45:52 crc kubenswrapper[4931]: I0130 05:45:52.206469 4931 scope.go:117] "RemoveContainer" containerID="76f686a64b7bcba52e9cf572d78b41631a5873f435dbdf098126fe32ac5ccc3f" Jan 30 05:45:53 crc kubenswrapper[4931]: I0130 05:45:53.422217 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:45:53 crc kubenswrapper[4931]: E0130 05:45:53.422773 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:46:07 crc kubenswrapper[4931]: I0130 05:46:07.422925 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:46:07 crc kubenswrapper[4931]: E0130 05:46:07.424275 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:46:22 crc kubenswrapper[4931]: I0130 05:46:22.421891 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:46:22 crc kubenswrapper[4931]: E0130 05:46:22.423016 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:46:34 crc kubenswrapper[4931]: I0130 05:46:34.422856 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:46:34 crc kubenswrapper[4931]: E0130 05:46:34.423877 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:46:47 crc kubenswrapper[4931]: I0130 05:46:47.422220 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:46:47 crc kubenswrapper[4931]: E0130 05:46:47.423270 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:47:02 crc kubenswrapper[4931]: I0130 05:47:02.421622 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:47:02 crc kubenswrapper[4931]: E0130 05:47:02.422246 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:47:15 crc kubenswrapper[4931]: I0130 05:47:15.429283 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:47:15 crc kubenswrapper[4931]: E0130 05:47:15.430717 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:47:30 crc kubenswrapper[4931]: I0130 05:47:30.422743 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:47:30 crc kubenswrapper[4931]: E0130 05:47:30.423453 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:47:45 crc kubenswrapper[4931]: I0130 05:47:45.434535 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:47:45 crc kubenswrapper[4931]: E0130 05:47:45.436314 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:47:58 crc kubenswrapper[4931]: I0130 05:47:58.437235 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:47:58 crc kubenswrapper[4931]: E0130 05:47:58.438473 4931 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:48:12 crc kubenswrapper[4931]: I0130 05:48:12.422400 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:48:12 crc kubenswrapper[4931]: E0130 05:48:12.423598 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:48:25 crc kubenswrapper[4931]: I0130 05:48:25.429987 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:48:25 crc kubenswrapper[4931]: E0130 05:48:25.431259 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:48:39 crc kubenswrapper[4931]: I0130 05:48:39.421920 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:48:39 crc kubenswrapper[4931]: E0130 05:48:39.423008 4931 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:48:52 crc kubenswrapper[4931]: I0130 05:48:52.422795 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:48:52 crc kubenswrapper[4931]: E0130 05:48:52.423826 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:49:03 crc kubenswrapper[4931]: I0130 05:49:03.423143 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:49:03 crc kubenswrapper[4931]: E0130 05:49:03.424109 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:49:16 crc kubenswrapper[4931]: I0130 05:49:16.422090 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:49:16 crc kubenswrapper[4931]: E0130 05:49:16.422921 4931 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:49:31 crc kubenswrapper[4931]: I0130 05:49:31.422635 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:49:32 crc kubenswrapper[4931]: I0130 05:49:32.183601 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"3bffa97502e128e6feab5bb4faa23ada0376dcd29bfbd235484da7266eed26ff"} Jan 30 05:51:57 crc kubenswrapper[4931]: I0130 05:51:57.362974 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:51:57 crc kubenswrapper[4931]: I0130 05:51:57.364606 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:52:27 crc kubenswrapper[4931]: I0130 05:52:27.363873 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Jan 30 05:52:27 crc kubenswrapper[4931]: I0130 05:52:27.364577 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:52:57 crc kubenswrapper[4931]: I0130 05:52:57.363183 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:52:57 crc kubenswrapper[4931]: I0130 05:52:57.363782 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:52:57 crc kubenswrapper[4931]: I0130 05:52:57.363838 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:52:57 crc kubenswrapper[4931]: I0130 05:52:57.364529 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bffa97502e128e6feab5bb4faa23ada0376dcd29bfbd235484da7266eed26ff"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:52:57 crc kubenswrapper[4931]: I0130 05:52:57.364605 4931 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://3bffa97502e128e6feab5bb4faa23ada0376dcd29bfbd235484da7266eed26ff" gracePeriod=600 Jan 30 05:52:58 crc kubenswrapper[4931]: I0130 05:52:58.137506 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="3bffa97502e128e6feab5bb4faa23ada0376dcd29bfbd235484da7266eed26ff" exitCode=0 Jan 30 05:52:58 crc kubenswrapper[4931]: I0130 05:52:58.137614 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"3bffa97502e128e6feab5bb4faa23ada0376dcd29bfbd235484da7266eed26ff"} Jan 30 05:52:58 crc kubenswrapper[4931]: I0130 05:52:58.137885 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb"} Jan 30 05:52:58 crc kubenswrapper[4931]: I0130 05:52:58.137910 4931 scope.go:117] "RemoveContainer" containerID="e584eab46a61f7359d189dfe47f467606f0c8ed3cbb55e61552452b295cc93b4" Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.702335 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j5fj9"] Jan 30 05:53:08 crc kubenswrapper[4931]: E0130 05:53:08.703718 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ad7b66-28c5-436b-9dc4-86be3d48787b" containerName="collect-profiles" Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.703742 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ad7b66-28c5-436b-9dc4-86be3d48787b" containerName="collect-profiles" Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.703991 4931 
memory_manager.go:354] "RemoveStaleState removing state" podUID="71ad7b66-28c5-436b-9dc4-86be3d48787b" containerName="collect-profiles"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.705696 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.730593 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5fj9"]
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.852688 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt87w\" (UniqueName: \"kubernetes.io/projected/c5cc4dba-5433-4509-bf60-d080a781977b-kube-api-access-pt87w\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.852824 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-catalog-content\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.852971 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-utilities\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.954108 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt87w\" (UniqueName: \"kubernetes.io/projected/c5cc4dba-5433-4509-bf60-d080a781977b-kube-api-access-pt87w\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.954165 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-catalog-content\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.954257 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-utilities\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.954783 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-utilities\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.954832 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-catalog-content\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:08 crc kubenswrapper[4931]: I0130 05:53:08.988384 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt87w\" (UniqueName: \"kubernetes.io/projected/c5cc4dba-5433-4509-bf60-d080a781977b-kube-api-access-pt87w\") pod \"community-operators-j5fj9\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") " pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.039262 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.268284 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7p6c"]
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.270808 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.280598 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7p6c"]
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.359560 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-catalog-content\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.359618 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-utilities\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.359648 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xvf\" (UniqueName: \"kubernetes.io/projected/962ef2ba-af31-4ef4-a699-dd69242ec082-kube-api-access-h9xvf\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.461087 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-catalog-content\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.461143 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-utilities\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.461175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xvf\" (UniqueName: \"kubernetes.io/projected/962ef2ba-af31-4ef4-a699-dd69242ec082-kube-api-access-h9xvf\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.461675 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-catalog-content\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.461813 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-utilities\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.479254 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xvf\" (UniqueName: \"kubernetes.io/projected/962ef2ba-af31-4ef4-a699-dd69242ec082-kube-api-access-h9xvf\") pod \"certified-operators-v7p6c\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") " pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.540714 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5fj9"]
Jan 30 05:53:09 crc kubenswrapper[4931]: I0130 05:53:09.595709 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:10 crc kubenswrapper[4931]: I0130 05:53:10.056421 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7p6c"]
Jan 30 05:53:10 crc kubenswrapper[4931]: W0130 05:53:10.061531 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod962ef2ba_af31_4ef4_a699_dd69242ec082.slice/crio-480315bed4d6d9f6b661589af2e73b383826b6ce68cbba795e0ca02d6721c48e WatchSource:0}: Error finding container 480315bed4d6d9f6b661589af2e73b383826b6ce68cbba795e0ca02d6721c48e: Status 404 returned error can't find the container with id 480315bed4d6d9f6b661589af2e73b383826b6ce68cbba795e0ca02d6721c48e
Jan 30 05:53:10 crc kubenswrapper[4931]: I0130 05:53:10.242539 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerStarted","Data":"fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc"}
Jan 30 05:53:10 crc kubenswrapper[4931]: I0130 05:53:10.242580 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerStarted","Data":"480315bed4d6d9f6b661589af2e73b383826b6ce68cbba795e0ca02d6721c48e"}
Jan 30 05:53:10 crc kubenswrapper[4931]: I0130 05:53:10.251383 4931 generic.go:334] "Generic (PLEG): container finished" podID="c5cc4dba-5433-4509-bf60-d080a781977b" containerID="c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266" exitCode=0
Jan 30 05:53:10 crc kubenswrapper[4931]: I0130 05:53:10.251438 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fj9" event={"ID":"c5cc4dba-5433-4509-bf60-d080a781977b","Type":"ContainerDied","Data":"c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266"}
Jan 30 05:53:10 crc kubenswrapper[4931]: I0130 05:53:10.251461 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fj9" event={"ID":"c5cc4dba-5433-4509-bf60-d080a781977b","Type":"ContainerStarted","Data":"38a774b986b60a9f602dd779e9c808da57377b8f459ce6e5789f360ff4050322"}
Jan 30 05:53:10 crc kubenswrapper[4931]: I0130 05:53:10.253689 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 05:53:11 crc kubenswrapper[4931]: I0130 05:53:11.264463 4931 generic.go:334] "Generic (PLEG): container finished" podID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerID="fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc" exitCode=0
Jan 30 05:53:11 crc kubenswrapper[4931]: I0130 05:53:11.264564 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerDied","Data":"fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc"}
Jan 30 05:53:11 crc kubenswrapper[4931]: I0130 05:53:11.270735 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fj9" event={"ID":"c5cc4dba-5433-4509-bf60-d080a781977b","Type":"ContainerStarted","Data":"aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1"}
Jan 30 05:53:12 crc kubenswrapper[4931]: I0130 05:53:12.283320 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerStarted","Data":"fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852"}
Jan 30 05:53:12 crc kubenswrapper[4931]: I0130 05:53:12.288140 4931 generic.go:334] "Generic (PLEG): container finished" podID="c5cc4dba-5433-4509-bf60-d080a781977b" containerID="aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1" exitCode=0
Jan 30 05:53:12 crc kubenswrapper[4931]: I0130 05:53:12.288195 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fj9" event={"ID":"c5cc4dba-5433-4509-bf60-d080a781977b","Type":"ContainerDied","Data":"aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1"}
Jan 30 05:53:13 crc kubenswrapper[4931]: I0130 05:53:13.300715 4931 generic.go:334] "Generic (PLEG): container finished" podID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerID="fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852" exitCode=0
Jan 30 05:53:13 crc kubenswrapper[4931]: I0130 05:53:13.300771 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerDied","Data":"fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852"}
Jan 30 05:53:13 crc kubenswrapper[4931]: I0130 05:53:13.304927 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fj9" event={"ID":"c5cc4dba-5433-4509-bf60-d080a781977b","Type":"ContainerStarted","Data":"e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886"}
Jan 30 05:53:13 crc kubenswrapper[4931]: I0130 05:53:13.377109 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j5fj9" podStartSLOduration=2.932372466 podStartE2EDuration="5.377084499s" podCreationTimestamp="2026-01-30 05:53:08 +0000 UTC" firstStartedPulling="2026-01-30 05:53:10.253498883 +0000 UTC m=+2725.623409140" lastFinishedPulling="2026-01-30 05:53:12.698210886 +0000 UTC m=+2728.068121173" observedRunningTime="2026-01-30 05:53:13.360851607 +0000 UTC m=+2728.730761894" watchObservedRunningTime="2026-01-30 05:53:13.377084499 +0000 UTC m=+2728.746994796"
Jan 30 05:53:14 crc kubenswrapper[4931]: I0130 05:53:14.315521 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerStarted","Data":"98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7"}
Jan 30 05:53:14 crc kubenswrapper[4931]: I0130 05:53:14.344754 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7p6c" podStartSLOduration=2.927179895 podStartE2EDuration="5.344736692s" podCreationTimestamp="2026-01-30 05:53:09 +0000 UTC" firstStartedPulling="2026-01-30 05:53:11.267573079 +0000 UTC m=+2726.637483386" lastFinishedPulling="2026-01-30 05:53:13.685129916 +0000 UTC m=+2729.055040183" observedRunningTime="2026-01-30 05:53:14.340409821 +0000 UTC m=+2729.710320068" watchObservedRunningTime="2026-01-30 05:53:14.344736692 +0000 UTC m=+2729.714646949"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.040216 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.041035 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.104452 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.435572 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.597267 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.597358 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.646500 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:19 crc kubenswrapper[4931]: I0130 05:53:19.659040 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5fj9"]
Jan 30 05:53:20 crc kubenswrapper[4931]: I0130 05:53:20.422980 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:21 crc kubenswrapper[4931]: I0130 05:53:21.376189 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j5fj9" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="registry-server" containerID="cri-o://e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886" gracePeriod=2
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.044853 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.073691 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7p6c"]
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.158398 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-utilities\") pod \"c5cc4dba-5433-4509-bf60-d080a781977b\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") "
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.158549 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt87w\" (UniqueName: \"kubernetes.io/projected/c5cc4dba-5433-4509-bf60-d080a781977b-kube-api-access-pt87w\") pod \"c5cc4dba-5433-4509-bf60-d080a781977b\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") "
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.158697 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-catalog-content\") pod \"c5cc4dba-5433-4509-bf60-d080a781977b\" (UID: \"c5cc4dba-5433-4509-bf60-d080a781977b\") "
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.160275 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-utilities" (OuterVolumeSpecName: "utilities") pod "c5cc4dba-5433-4509-bf60-d080a781977b" (UID: "c5cc4dba-5433-4509-bf60-d080a781977b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.167013 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc4dba-5433-4509-bf60-d080a781977b-kube-api-access-pt87w" (OuterVolumeSpecName: "kube-api-access-pt87w") pod "c5cc4dba-5433-4509-bf60-d080a781977b" (UID: "c5cc4dba-5433-4509-bf60-d080a781977b"). InnerVolumeSpecName "kube-api-access-pt87w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.243922 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5cc4dba-5433-4509-bf60-d080a781977b" (UID: "c5cc4dba-5433-4509-bf60-d080a781977b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.261060 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.261108 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt87w\" (UniqueName: \"kubernetes.io/projected/c5cc4dba-5433-4509-bf60-d080a781977b-kube-api-access-pt87w\") on node \"crc\" DevicePath \"\""
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.261129 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cc4dba-5433-4509-bf60-d080a781977b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.391114 4931 generic.go:334] "Generic (PLEG): container finished" podID="c5cc4dba-5433-4509-bf60-d080a781977b" containerID="e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886" exitCode=0
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.391200 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5fj9"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.391204 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fj9" event={"ID":"c5cc4dba-5433-4509-bf60-d080a781977b","Type":"ContainerDied","Data":"e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886"}
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.391390 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fj9" event={"ID":"c5cc4dba-5433-4509-bf60-d080a781977b","Type":"ContainerDied","Data":"38a774b986b60a9f602dd779e9c808da57377b8f459ce6e5789f360ff4050322"}
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.391456 4931 scope.go:117] "RemoveContainer" containerID="e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.392151 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7p6c" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="registry-server" containerID="cri-o://98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7" gracePeriod=2
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.441764 4931 scope.go:117] "RemoveContainer" containerID="aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.451732 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5fj9"]
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.463447 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j5fj9"]
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.465670 4931 scope.go:117] "RemoveContainer" containerID="c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.585452 4931 scope.go:117] "RemoveContainer" containerID="e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886"
Jan 30 05:53:22 crc kubenswrapper[4931]: E0130 05:53:22.586011 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886\": container with ID starting with e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886 not found: ID does not exist" containerID="e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.586052 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886"} err="failed to get container status \"e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886\": rpc error: code = NotFound desc = could not find container \"e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886\": container with ID starting with e3eeafbbfced90de8534642c23434c1e4d70db646b394266f1065f08c747a886 not found: ID does not exist"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.586077 4931 scope.go:117] "RemoveContainer" containerID="aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1"
Jan 30 05:53:22 crc kubenswrapper[4931]: E0130 05:53:22.586711 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1\": container with ID starting with aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1 not found: ID does not exist" containerID="aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.586774 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1"} err="failed to get container status \"aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1\": rpc error: code = NotFound desc = could not find container \"aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1\": container with ID starting with aeb9770910e3b9f6716b8f9e0244c9ac9f49c525029df601a9cad4898f3628c1 not found: ID does not exist"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.586816 4931 scope.go:117] "RemoveContainer" containerID="c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266"
Jan 30 05:53:22 crc kubenswrapper[4931]: E0130 05:53:22.587153 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266\": container with ID starting with c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266 not found: ID does not exist" containerID="c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.587184 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266"} err="failed to get container status \"c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266\": rpc error: code = NotFound desc = could not find container \"c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266\": container with ID starting with c3c52c593d718f3b446da1c8d4b7a0369c62a74499d34ed1819c126fb7db7266 not found: ID does not exist"
Jan 30 05:53:22 crc kubenswrapper[4931]: I0130 05:53:22.973378 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.086579 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-catalog-content\") pod \"962ef2ba-af31-4ef4-a699-dd69242ec082\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") "
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.086681 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9xvf\" (UniqueName: \"kubernetes.io/projected/962ef2ba-af31-4ef4-a699-dd69242ec082-kube-api-access-h9xvf\") pod \"962ef2ba-af31-4ef4-a699-dd69242ec082\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") "
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.086727 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-utilities\") pod \"962ef2ba-af31-4ef4-a699-dd69242ec082\" (UID: \"962ef2ba-af31-4ef4-a699-dd69242ec082\") "
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.087687 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-utilities" (OuterVolumeSpecName: "utilities") pod "962ef2ba-af31-4ef4-a699-dd69242ec082" (UID: "962ef2ba-af31-4ef4-a699-dd69242ec082"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.090604 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962ef2ba-af31-4ef4-a699-dd69242ec082-kube-api-access-h9xvf" (OuterVolumeSpecName: "kube-api-access-h9xvf") pod "962ef2ba-af31-4ef4-a699-dd69242ec082" (UID: "962ef2ba-af31-4ef4-a699-dd69242ec082"). InnerVolumeSpecName "kube-api-access-h9xvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.145085 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "962ef2ba-af31-4ef4-a699-dd69242ec082" (UID: "962ef2ba-af31-4ef4-a699-dd69242ec082"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.188948 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.189013 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9xvf\" (UniqueName: \"kubernetes.io/projected/962ef2ba-af31-4ef4-a699-dd69242ec082-kube-api-access-h9xvf\") on node \"crc\" DevicePath \"\""
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.189038 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962ef2ba-af31-4ef4-a699-dd69242ec082-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.404616 4931 generic.go:334] "Generic (PLEG): container finished" podID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerID="98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7" exitCode=0
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.404683 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerDied","Data":"98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7"}
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.404744 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7p6c"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.405156 4931 scope.go:117] "RemoveContainer" containerID="98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.405132 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7p6c" event={"ID":"962ef2ba-af31-4ef4-a699-dd69242ec082","Type":"ContainerDied","Data":"480315bed4d6d9f6b661589af2e73b383826b6ce68cbba795e0ca02d6721c48e"}
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.437248 4931 scope.go:117] "RemoveContainer" containerID="fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.442197 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" path="/var/lib/kubelet/pods/c5cc4dba-5433-4509-bf60-d080a781977b/volumes"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.470603 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7p6c"]
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.481664 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7p6c"]
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.491911 4931 scope.go:117] "RemoveContainer" containerID="fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.515729 4931 scope.go:117] "RemoveContainer" containerID="98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7"
Jan 30 05:53:23 crc kubenswrapper[4931]: E0130 05:53:23.516824 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7\": container with ID starting with 98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7 not found: ID does not exist" containerID="98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.516863 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7"} err="failed to get container status \"98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7\": rpc error: code = NotFound desc = could not find container \"98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7\": container with ID starting with 98a4f16a2d3e2b5814fd2d6e8656d4f876ee224f408b34699b7569c875dbc6f7 not found: ID does not exist"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.516889 4931 scope.go:117] "RemoveContainer" containerID="fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852"
Jan 30 05:53:23 crc kubenswrapper[4931]: E0130 05:53:23.517733 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852\": container with ID starting with fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852 not found: ID does not exist" containerID="fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.517886 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852"} err="failed to get container status \"fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852\": rpc error: code = NotFound desc = could not find container \"fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852\": container with ID starting with fa57e4401c4a867d3d80c1c7b2e0c5a34a1770e292a6ee56501504b6b128a852 not found: ID does not exist"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.517958 4931 scope.go:117] "RemoveContainer" containerID="fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc"
Jan 30 05:53:23 crc kubenswrapper[4931]: E0130 05:53:23.518869 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc\": container with ID starting with fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc not found: ID does not exist" containerID="fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc"
Jan 30 05:53:23 crc kubenswrapper[4931]: I0130 05:53:23.518919 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc"} err="failed to get container status \"fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc\": rpc error: code = NotFound desc = could not find container \"fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc\": container with ID starting with fc423a2426262ce36e49f1537c46892fde45a02e00935ca420e4803025a86ebc not found: ID does not exist"
Jan 30 05:53:25 crc kubenswrapper[4931]: I0130 05:53:25.439234 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" path="/var/lib/kubelet/pods/962ef2ba-af31-4ef4-a699-dd69242ec082/volumes"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.116272 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xxts8"]
Jan 30 05:54:23 crc kubenswrapper[4931]: E0130 05:54:23.117451 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="extract-content"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117474 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="extract-content"
Jan 30 05:54:23 crc kubenswrapper[4931]: E0130 05:54:23.117491 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="registry-server"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117501 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="registry-server"
Jan 30 05:54:23 crc kubenswrapper[4931]: E0130 05:54:23.117529 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="extract-content"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117540 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="extract-content"
Jan 30 05:54:23 crc kubenswrapper[4931]: E0130 05:54:23.117565 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="extract-utilities"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117575 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="extract-utilities"
Jan 30 05:54:23 crc kubenswrapper[4931]: E0130 05:54:23.117596 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="extract-utilities"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117606 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="extract-utilities"
Jan 30 05:54:23 crc kubenswrapper[4931]: E0130 05:54:23.117628 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="registry-server"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117640 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="registry-server"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117832 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc4dba-5433-4509-bf60-d080a781977b" containerName="registry-server"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.117876 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="962ef2ba-af31-4ef4-a699-dd69242ec082" containerName="registry-server"
Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.119366 4931 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.137483 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxts8"] Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.296648 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-utilities\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.296700 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-catalog-content\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.296812 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t27j\" (UniqueName: \"kubernetes.io/projected/64eb9b0a-7a6b-479c-93ee-118642bac30f-kube-api-access-7t27j\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.398408 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t27j\" (UniqueName: \"kubernetes.io/projected/64eb9b0a-7a6b-479c-93ee-118642bac30f-kube-api-access-7t27j\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.398531 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-utilities\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.398569 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-catalog-content\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.398983 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-utilities\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.399054 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-catalog-content\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.419168 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t27j\" (UniqueName: \"kubernetes.io/projected/64eb9b0a-7a6b-479c-93ee-118642bac30f-kube-api-access-7t27j\") pod \"redhat-operators-xxts8\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.438684 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.904280 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxts8"] Jan 30 05:54:23 crc kubenswrapper[4931]: I0130 05:54:23.985528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxts8" event={"ID":"64eb9b0a-7a6b-479c-93ee-118642bac30f","Type":"ContainerStarted","Data":"5e48aaf71e51dd10317bd589006ab382d0e5ecebbc057cd2b565b569c9c29556"} Jan 30 05:54:24 crc kubenswrapper[4931]: I0130 05:54:24.997043 4931 generic.go:334] "Generic (PLEG): container finished" podID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerID="69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3" exitCode=0 Jan 30 05:54:24 crc kubenswrapper[4931]: I0130 05:54:24.997235 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxts8" event={"ID":"64eb9b0a-7a6b-479c-93ee-118642bac30f","Type":"ContainerDied","Data":"69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3"} Jan 30 05:54:26 crc kubenswrapper[4931]: I0130 05:54:26.011914 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxts8" event={"ID":"64eb9b0a-7a6b-479c-93ee-118642bac30f","Type":"ContainerStarted","Data":"3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4"} Jan 30 05:54:27 crc kubenswrapper[4931]: I0130 05:54:27.024539 4931 generic.go:334] "Generic (PLEG): container finished" podID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerID="3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4" exitCode=0 Jan 30 05:54:27 crc kubenswrapper[4931]: I0130 05:54:27.024609 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxts8" 
event={"ID":"64eb9b0a-7a6b-479c-93ee-118642bac30f","Type":"ContainerDied","Data":"3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4"} Jan 30 05:54:28 crc kubenswrapper[4931]: I0130 05:54:28.039152 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxts8" event={"ID":"64eb9b0a-7a6b-479c-93ee-118642bac30f","Type":"ContainerStarted","Data":"962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf"} Jan 30 05:54:33 crc kubenswrapper[4931]: I0130 05:54:33.439345 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:33 crc kubenswrapper[4931]: I0130 05:54:33.441142 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:34 crc kubenswrapper[4931]: I0130 05:54:34.504627 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xxts8" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="registry-server" probeResult="failure" output=< Jan 30 05:54:34 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 05:54:34 crc kubenswrapper[4931]: > Jan 30 05:54:43 crc kubenswrapper[4931]: I0130 05:54:43.507126 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:43 crc kubenswrapper[4931]: I0130 05:54:43.547044 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xxts8" podStartSLOduration=17.953757635 podStartE2EDuration="20.547016933s" podCreationTimestamp="2026-01-30 05:54:23 +0000 UTC" firstStartedPulling="2026-01-30 05:54:25.000605722 +0000 UTC m=+2800.370516019" lastFinishedPulling="2026-01-30 05:54:27.59386503 +0000 UTC m=+2802.963775317" observedRunningTime="2026-01-30 05:54:28.073866385 +0000 UTC m=+2803.443776672" 
watchObservedRunningTime="2026-01-30 05:54:43.547016933 +0000 UTC m=+2818.916927220" Jan 30 05:54:43 crc kubenswrapper[4931]: I0130 05:54:43.590907 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:43 crc kubenswrapper[4931]: I0130 05:54:43.756084 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxts8"] Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.198294 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xxts8" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="registry-server" containerID="cri-o://962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf" gracePeriod=2 Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.662841 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.780309 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-utilities\") pod \"64eb9b0a-7a6b-479c-93ee-118642bac30f\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.780381 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-catalog-content\") pod \"64eb9b0a-7a6b-479c-93ee-118642bac30f\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.780587 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t27j\" (UniqueName: \"kubernetes.io/projected/64eb9b0a-7a6b-479c-93ee-118642bac30f-kube-api-access-7t27j\") pod 
\"64eb9b0a-7a6b-479c-93ee-118642bac30f\" (UID: \"64eb9b0a-7a6b-479c-93ee-118642bac30f\") " Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.781248 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-utilities" (OuterVolumeSpecName: "utilities") pod "64eb9b0a-7a6b-479c-93ee-118642bac30f" (UID: "64eb9b0a-7a6b-479c-93ee-118642bac30f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.786903 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64eb9b0a-7a6b-479c-93ee-118642bac30f-kube-api-access-7t27j" (OuterVolumeSpecName: "kube-api-access-7t27j") pod "64eb9b0a-7a6b-479c-93ee-118642bac30f" (UID: "64eb9b0a-7a6b-479c-93ee-118642bac30f"). InnerVolumeSpecName "kube-api-access-7t27j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.882701 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t27j\" (UniqueName: \"kubernetes.io/projected/64eb9b0a-7a6b-479c-93ee-118642bac30f-kube-api-access-7t27j\") on node \"crc\" DevicePath \"\"" Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.882757 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.941964 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64eb9b0a-7a6b-479c-93ee-118642bac30f" (UID: "64eb9b0a-7a6b-479c-93ee-118642bac30f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:54:45 crc kubenswrapper[4931]: I0130 05:54:45.984222 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eb9b0a-7a6b-479c-93ee-118642bac30f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.212594 4931 generic.go:334] "Generic (PLEG): container finished" podID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerID="962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf" exitCode=0 Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.212654 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxts8" event={"ID":"64eb9b0a-7a6b-479c-93ee-118642bac30f","Type":"ContainerDied","Data":"962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf"} Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.212695 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxts8" event={"ID":"64eb9b0a-7a6b-479c-93ee-118642bac30f","Type":"ContainerDied","Data":"5e48aaf71e51dd10317bd589006ab382d0e5ecebbc057cd2b565b569c9c29556"} Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.212723 4931 scope.go:117] "RemoveContainer" containerID="962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.212733 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxts8" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.247869 4931 scope.go:117] "RemoveContainer" containerID="3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.274282 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxts8"] Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.284396 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xxts8"] Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.301687 4931 scope.go:117] "RemoveContainer" containerID="69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.321866 4931 scope.go:117] "RemoveContainer" containerID="962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf" Jan 30 05:54:46 crc kubenswrapper[4931]: E0130 05:54:46.322273 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf\": container with ID starting with 962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf not found: ID does not exist" containerID="962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.322334 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf"} err="failed to get container status \"962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf\": rpc error: code = NotFound desc = could not find container \"962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf\": container with ID starting with 962ef63dffb4cd6b5ff3880e0f6450256c1a450a0d3c3696bc5cd38f94eac1bf not found: ID does 
not exist" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.322371 4931 scope.go:117] "RemoveContainer" containerID="3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4" Jan 30 05:54:46 crc kubenswrapper[4931]: E0130 05:54:46.323042 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4\": container with ID starting with 3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4 not found: ID does not exist" containerID="3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.323089 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4"} err="failed to get container status \"3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4\": rpc error: code = NotFound desc = could not find container \"3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4\": container with ID starting with 3ab82937c12f6fda05938c880ad10d71218abf1a8f1d6775a183f50c6d7078e4 not found: ID does not exist" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.323118 4931 scope.go:117] "RemoveContainer" containerID="69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3" Jan 30 05:54:46 crc kubenswrapper[4931]: E0130 05:54:46.323687 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3\": container with ID starting with 69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3 not found: ID does not exist" containerID="69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3" Jan 30 05:54:46 crc kubenswrapper[4931]: I0130 05:54:46.323741 4931 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3"} err="failed to get container status \"69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3\": rpc error: code = NotFound desc = could not find container \"69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3\": container with ID starting with 69934b78c3a186dd3ec212cfc8d944537b3fcdbd318da93763df4882959443d3 not found: ID does not exist" Jan 30 05:54:47 crc kubenswrapper[4931]: I0130 05:54:47.437297 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" path="/var/lib/kubelet/pods/64eb9b0a-7a6b-479c-93ee-118642bac30f/volumes" Jan 30 05:54:57 crc kubenswrapper[4931]: I0130 05:54:57.362904 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:54:57 crc kubenswrapper[4931]: I0130 05:54:57.363506 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:55:27 crc kubenswrapper[4931]: I0130 05:55:27.362966 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:55:27 crc kubenswrapper[4931]: I0130 05:55:27.363630 4931 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.362990 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.363680 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.363747 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.364547 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.364641 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" 
containerID="cri-o://f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" gracePeriod=600 Jan 30 05:55:57 crc kubenswrapper[4931]: E0130 05:55:57.503368 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.873469 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" exitCode=0 Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.873542 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb"} Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.873958 4931 scope.go:117] "RemoveContainer" containerID="3bffa97502e128e6feab5bb4faa23ada0376dcd29bfbd235484da7266eed26ff" Jan 30 05:55:57 crc kubenswrapper[4931]: I0130 05:55:57.874531 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:55:57 crc kubenswrapper[4931]: E0130 05:55:57.874866 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:56:11 crc kubenswrapper[4931]: I0130 05:56:11.423975 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:56:11 crc kubenswrapper[4931]: E0130 05:56:11.425707 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:56:25 crc kubenswrapper[4931]: I0130 05:56:25.431017 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:56:25 crc kubenswrapper[4931]: E0130 05:56:25.432291 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:56:36 crc kubenswrapper[4931]: I0130 05:56:36.422316 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:56:36 crc kubenswrapper[4931]: E0130 05:56:36.423321 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:56:51 crc kubenswrapper[4931]: I0130 05:56:51.423704 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:56:51 crc kubenswrapper[4931]: E0130 05:56:51.425036 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:57:05 crc kubenswrapper[4931]: I0130 05:57:05.423700 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:57:05 crc kubenswrapper[4931]: E0130 05:57:05.424642 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:57:19 crc kubenswrapper[4931]: I0130 05:57:19.422528 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:57:19 crc kubenswrapper[4931]: E0130 05:57:19.423682 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:57:31 crc kubenswrapper[4931]: I0130 05:57:31.422320 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:57:31 crc kubenswrapper[4931]: E0130 05:57:31.423162 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:57:44 crc kubenswrapper[4931]: I0130 05:57:44.422588 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:57:44 crc kubenswrapper[4931]: E0130 05:57:44.423818 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:57:59 crc kubenswrapper[4931]: I0130 05:57:59.422457 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:57:59 crc kubenswrapper[4931]: E0130 05:57:59.424251 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:58:11 crc kubenswrapper[4931]: I0130 05:58:11.423021 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:58:11 crc kubenswrapper[4931]: E0130 05:58:11.424225 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.686090 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gg7gh"] Jan 30 05:58:16 crc kubenswrapper[4931]: E0130 05:58:16.687277 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="extract-utilities" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.687313 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="extract-utilities" Jan 30 05:58:16 crc kubenswrapper[4931]: E0130 05:58:16.687343 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="registry-server" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.687360 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="registry-server" Jan 30 05:58:16 crc kubenswrapper[4931]: E0130 05:58:16.687417 4931 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="extract-content" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.687468 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="extract-content" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.687849 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="64eb9b0a-7a6b-479c-93ee-118642bac30f" containerName="registry-server" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.690710 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.728179 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg7gh"] Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.829702 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wrkm\" (UniqueName: \"kubernetes.io/projected/9e69c7f8-6633-40ec-baf7-33cd56b80526-kube-api-access-9wrkm\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.829758 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-utilities\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.829857 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-catalog-content\") pod \"redhat-marketplace-gg7gh\" 
(UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.931526 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-catalog-content\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.931704 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wrkm\" (UniqueName: \"kubernetes.io/projected/9e69c7f8-6633-40ec-baf7-33cd56b80526-kube-api-access-9wrkm\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.931751 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-utilities\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.932138 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-catalog-content\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.932281 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-utilities\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " 
pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:16 crc kubenswrapper[4931]: I0130 05:58:16.959777 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wrkm\" (UniqueName: \"kubernetes.io/projected/9e69c7f8-6633-40ec-baf7-33cd56b80526-kube-api-access-9wrkm\") pod \"redhat-marketplace-gg7gh\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:17 crc kubenswrapper[4931]: I0130 05:58:17.026268 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:17 crc kubenswrapper[4931]: I0130 05:58:17.495872 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg7gh"] Jan 30 05:58:18 crc kubenswrapper[4931]: I0130 05:58:18.205340 4931 generic.go:334] "Generic (PLEG): container finished" podID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerID="65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f" exitCode=0 Jan 30 05:58:18 crc kubenswrapper[4931]: I0130 05:58:18.205451 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg7gh" event={"ID":"9e69c7f8-6633-40ec-baf7-33cd56b80526","Type":"ContainerDied","Data":"65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f"} Jan 30 05:58:18 crc kubenswrapper[4931]: I0130 05:58:18.205522 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg7gh" event={"ID":"9e69c7f8-6633-40ec-baf7-33cd56b80526","Type":"ContainerStarted","Data":"b6f132ffffa9aa9e1de373451be5e50268b753fc3cd9a4826a9346c118d3b238"} Jan 30 05:58:18 crc kubenswrapper[4931]: I0130 05:58:18.208506 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:58:19 crc kubenswrapper[4931]: I0130 05:58:19.213352 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gg7gh" event={"ID":"9e69c7f8-6633-40ec-baf7-33cd56b80526","Type":"ContainerStarted","Data":"4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e"} Jan 30 05:58:20 crc kubenswrapper[4931]: I0130 05:58:20.227279 4931 generic.go:334] "Generic (PLEG): container finished" podID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerID="4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e" exitCode=0 Jan 30 05:58:20 crc kubenswrapper[4931]: I0130 05:58:20.227377 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg7gh" event={"ID":"9e69c7f8-6633-40ec-baf7-33cd56b80526","Type":"ContainerDied","Data":"4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e"} Jan 30 05:58:21 crc kubenswrapper[4931]: I0130 05:58:21.241374 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg7gh" event={"ID":"9e69c7f8-6633-40ec-baf7-33cd56b80526","Type":"ContainerStarted","Data":"09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c"} Jan 30 05:58:21 crc kubenswrapper[4931]: I0130 05:58:21.285700 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gg7gh" podStartSLOduration=2.755532109 podStartE2EDuration="5.285664399s" podCreationTimestamp="2026-01-30 05:58:16 +0000 UTC" firstStartedPulling="2026-01-30 05:58:18.207908375 +0000 UTC m=+3033.577818662" lastFinishedPulling="2026-01-30 05:58:20.738040665 +0000 UTC m=+3036.107950952" observedRunningTime="2026-01-30 05:58:21.276096538 +0000 UTC m=+3036.646006805" watchObservedRunningTime="2026-01-30 05:58:21.285664399 +0000 UTC m=+3036.655574706" Jan 30 05:58:24 crc kubenswrapper[4931]: I0130 05:58:24.422507 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:58:24 crc kubenswrapper[4931]: E0130 05:58:24.423525 4931 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:58:27 crc kubenswrapper[4931]: I0130 05:58:27.026650 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:27 crc kubenswrapper[4931]: I0130 05:58:27.026790 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:27 crc kubenswrapper[4931]: I0130 05:58:27.111589 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:27 crc kubenswrapper[4931]: I0130 05:58:27.362753 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:30 crc kubenswrapper[4931]: I0130 05:58:30.667286 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg7gh"] Jan 30 05:58:30 crc kubenswrapper[4931]: I0130 05:58:30.668013 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gg7gh" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="registry-server" containerID="cri-o://09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c" gracePeriod=2 Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.076431 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.267628 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-catalog-content\") pod \"9e69c7f8-6633-40ec-baf7-33cd56b80526\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.268159 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-utilities\") pod \"9e69c7f8-6633-40ec-baf7-33cd56b80526\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.268274 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wrkm\" (UniqueName: \"kubernetes.io/projected/9e69c7f8-6633-40ec-baf7-33cd56b80526-kube-api-access-9wrkm\") pod \"9e69c7f8-6633-40ec-baf7-33cd56b80526\" (UID: \"9e69c7f8-6633-40ec-baf7-33cd56b80526\") " Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.269164 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-utilities" (OuterVolumeSpecName: "utilities") pod "9e69c7f8-6633-40ec-baf7-33cd56b80526" (UID: "9e69c7f8-6633-40ec-baf7-33cd56b80526"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.279574 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e69c7f8-6633-40ec-baf7-33cd56b80526-kube-api-access-9wrkm" (OuterVolumeSpecName: "kube-api-access-9wrkm") pod "9e69c7f8-6633-40ec-baf7-33cd56b80526" (UID: "9e69c7f8-6633-40ec-baf7-33cd56b80526"). InnerVolumeSpecName "kube-api-access-9wrkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.309190 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e69c7f8-6633-40ec-baf7-33cd56b80526" (UID: "9e69c7f8-6633-40ec-baf7-33cd56b80526"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.348096 4931 generic.go:334] "Generic (PLEG): container finished" podID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerID="09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c" exitCode=0 Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.348150 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg7gh" event={"ID":"9e69c7f8-6633-40ec-baf7-33cd56b80526","Type":"ContainerDied","Data":"09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c"} Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.348181 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gg7gh" event={"ID":"9e69c7f8-6633-40ec-baf7-33cd56b80526","Type":"ContainerDied","Data":"b6f132ffffa9aa9e1de373451be5e50268b753fc3cd9a4826a9346c118d3b238"} Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.348673 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gg7gh" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.348774 4931 scope.go:117] "RemoveContainer" containerID="09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.369676 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wrkm\" (UniqueName: \"kubernetes.io/projected/9e69c7f8-6633-40ec-baf7-33cd56b80526-kube-api-access-9wrkm\") on node \"crc\" DevicePath \"\"" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.369722 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.369742 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69c7f8-6633-40ec-baf7-33cd56b80526-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.381966 4931 scope.go:117] "RemoveContainer" containerID="4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.389607 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg7gh"] Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.395414 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gg7gh"] Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.401466 4931 scope.go:117] "RemoveContainer" containerID="65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.428561 4931 scope.go:117] "RemoveContainer" containerID="09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c" Jan 30 05:58:31 crc kubenswrapper[4931]: E0130 
05:58:31.428951 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c\": container with ID starting with 09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c not found: ID does not exist" containerID="09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.429067 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c"} err="failed to get container status \"09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c\": rpc error: code = NotFound desc = could not find container \"09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c\": container with ID starting with 09313b96f5317ca2081492e05aecd5ea8d38b582e37e0b6596d418a718ebe02c not found: ID does not exist" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.429150 4931 scope.go:117] "RemoveContainer" containerID="4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e" Jan 30 05:58:31 crc kubenswrapper[4931]: E0130 05:58:31.429602 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e\": container with ID starting with 4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e not found: ID does not exist" containerID="4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.429677 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e"} err="failed to get container status \"4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e\": rpc 
error: code = NotFound desc = could not find container \"4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e\": container with ID starting with 4495e4f4bca5085913535b514847ff140fc4fe4bef6afe94338910b4a33b047e not found: ID does not exist" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.429738 4931 scope.go:117] "RemoveContainer" containerID="65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.429818 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" path="/var/lib/kubelet/pods/9e69c7f8-6633-40ec-baf7-33cd56b80526/volumes" Jan 30 05:58:31 crc kubenswrapper[4931]: E0130 05:58:31.430122 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f\": container with ID starting with 65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f not found: ID does not exist" containerID="65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f" Jan 30 05:58:31 crc kubenswrapper[4931]: I0130 05:58:31.430182 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f"} err="failed to get container status \"65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f\": rpc error: code = NotFound desc = could not find container \"65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f\": container with ID starting with 65c5aa46d913497807faa62cda964f5223d6c4d84da21306349092f6cf00db7f not found: ID does not exist" Jan 30 05:58:35 crc kubenswrapper[4931]: I0130 05:58:35.429633 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:58:35 crc kubenswrapper[4931]: E0130 05:58:35.431011 4931 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:58:49 crc kubenswrapper[4931]: I0130 05:58:49.421936 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:58:49 crc kubenswrapper[4931]: E0130 05:58:49.422755 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:59:02 crc kubenswrapper[4931]: I0130 05:59:02.422119 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:59:02 crc kubenswrapper[4931]: E0130 05:59:02.424912 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:59:14 crc kubenswrapper[4931]: I0130 05:59:14.422120 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:59:14 crc kubenswrapper[4931]: E0130 
05:59:14.423483 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:59:29 crc kubenswrapper[4931]: I0130 05:59:29.422549 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:59:29 crc kubenswrapper[4931]: E0130 05:59:29.423880 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:59:44 crc kubenswrapper[4931]: I0130 05:59:44.422418 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:59:44 crc kubenswrapper[4931]: E0130 05:59:44.425112 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 05:59:57 crc kubenswrapper[4931]: I0130 05:59:57.422983 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 05:59:57 crc 
kubenswrapper[4931]: E0130 05:59:57.423991 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.166268 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g"] Jan 30 06:00:00 crc kubenswrapper[4931]: E0130 06:00:00.167304 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="extract-content" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.167323 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="extract-content" Jan 30 06:00:00 crc kubenswrapper[4931]: E0130 06:00:00.167345 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="extract-utilities" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.167362 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="extract-utilities" Jan 30 06:00:00 crc kubenswrapper[4931]: E0130 06:00:00.167395 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="registry-server" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.167403 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="registry-server" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.167628 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e69c7f8-6633-40ec-baf7-33cd56b80526" containerName="registry-server" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.168508 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.171960 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.176649 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.182237 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g"] Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.325927 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-config-volume\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.326006 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxgsk\" (UniqueName: \"kubernetes.io/projected/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-kube-api-access-hxgsk\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.326285 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-secret-volume\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.428310 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-secret-volume\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.428525 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-config-volume\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.428560 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxgsk\" (UniqueName: \"kubernetes.io/projected/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-kube-api-access-hxgsk\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.430534 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-config-volume\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.437003 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-secret-volume\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.461709 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxgsk\" (UniqueName: \"kubernetes.io/projected/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-kube-api-access-hxgsk\") pod \"collect-profiles-29495880-jrm5g\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:00 crc kubenswrapper[4931]: I0130 06:00:00.494600 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:01 crc kubenswrapper[4931]: I0130 06:00:01.047094 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g"] Jan 30 06:00:01 crc kubenswrapper[4931]: I0130 06:00:01.183054 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" event={"ID":"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba","Type":"ContainerStarted","Data":"1940902600dfb612e1c4d9cb34bdb0c6655ed85cfd9760778627a748d0b9b96e"} Jan 30 06:00:02 crc kubenswrapper[4931]: I0130 06:00:02.194121 4931 generic.go:334] "Generic (PLEG): container finished" podID="ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" containerID="18c71eb1241272ed04cbfce337c51a3320bfd0991c28ac36edc8dd0665668963" exitCode=0 Jan 30 06:00:02 crc kubenswrapper[4931]: I0130 06:00:02.194198 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" 
event={"ID":"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba","Type":"ContainerDied","Data":"18c71eb1241272ed04cbfce337c51a3320bfd0991c28ac36edc8dd0665668963"} Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.533186 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.582383 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-secret-volume\") pod \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.582508 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-config-volume\") pod \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.582656 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxgsk\" (UniqueName: \"kubernetes.io/projected/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-kube-api-access-hxgsk\") pod \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\" (UID: \"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba\") " Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.600341 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" (UID: "ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.601190 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" (UID: "ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.601309 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-kube-api-access-hxgsk" (OuterVolumeSpecName: "kube-api-access-hxgsk") pod "ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" (UID: "ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba"). InnerVolumeSpecName "kube-api-access-hxgsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.685475 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxgsk\" (UniqueName: \"kubernetes.io/projected/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-kube-api-access-hxgsk\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.685536 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:03 crc kubenswrapper[4931]: I0130 06:00:03.685560 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:04 crc kubenswrapper[4931]: I0130 06:00:04.215315 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" 
event={"ID":"ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba","Type":"ContainerDied","Data":"1940902600dfb612e1c4d9cb34bdb0c6655ed85cfd9760778627a748d0b9b96e"} Jan 30 06:00:04 crc kubenswrapper[4931]: I0130 06:00:04.215371 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1940902600dfb612e1c4d9cb34bdb0c6655ed85cfd9760778627a748d0b9b96e" Jan 30 06:00:04 crc kubenswrapper[4931]: I0130 06:00:04.215416 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g" Jan 30 06:00:04 crc kubenswrapper[4931]: I0130 06:00:04.633354 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb"] Jan 30 06:00:04 crc kubenswrapper[4931]: I0130 06:00:04.639207 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-kk9gb"] Jan 30 06:00:05 crc kubenswrapper[4931]: I0130 06:00:05.436848 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2119e7a8-c484-4aef-ac04-c3f82433738d" path="/var/lib/kubelet/pods/2119e7a8-c484-4aef-ac04-c3f82433738d/volumes" Jan 30 06:00:09 crc kubenswrapper[4931]: I0130 06:00:09.422779 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 06:00:09 crc kubenswrapper[4931]: E0130 06:00:09.423486 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:00:22 crc kubenswrapper[4931]: I0130 06:00:22.422500 4931 scope.go:117] "RemoveContainer" 
containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 06:00:22 crc kubenswrapper[4931]: E0130 06:00:22.424048 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:00:35 crc kubenswrapper[4931]: I0130 06:00:35.430650 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 06:00:35 crc kubenswrapper[4931]: E0130 06:00:35.431553 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:00:48 crc kubenswrapper[4931]: I0130 06:00:48.422877 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 06:00:48 crc kubenswrapper[4931]: E0130 06:00:48.424109 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:00:52 crc kubenswrapper[4931]: I0130 06:00:52.694964 4931 scope.go:117] 
"RemoveContainer" containerID="93024ef1482e0faf5c83b31d25bb0153752fe08f7d8619cc6cdb7d2120e5e084" Jan 30 06:01:00 crc kubenswrapper[4931]: I0130 06:01:00.422787 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 06:01:00 crc kubenswrapper[4931]: I0130 06:01:00.715491 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"3ee39d7345786d63837440f7d9f37384973d255b777a319f3ba033acd5419f51"} Jan 30 06:03:27 crc kubenswrapper[4931]: I0130 06:03:27.363389 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:03:27 crc kubenswrapper[4931]: I0130 06:03:27.364292 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:03:57 crc kubenswrapper[4931]: I0130 06:03:57.362710 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:03:57 crc kubenswrapper[4931]: I0130 06:03:57.363528 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.362922 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.364695 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.364877 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.365887 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ee39d7345786d63837440f7d9f37384973d255b777a319f3ba033acd5419f51"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.366000 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://3ee39d7345786d63837440f7d9f37384973d255b777a319f3ba033acd5419f51" gracePeriod=600 Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.553437 4931 generic.go:334] "Generic 
(PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="3ee39d7345786d63837440f7d9f37384973d255b777a319f3ba033acd5419f51" exitCode=0 Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.553626 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"3ee39d7345786d63837440f7d9f37384973d255b777a319f3ba033acd5419f51"} Jan 30 06:04:27 crc kubenswrapper[4931]: I0130 06:04:27.553772 4931 scope.go:117] "RemoveContainer" containerID="f2f73de32d7d73d5b5bb194607c20eb568ac797bbaa7bad33c857b1a456369eb" Jan 30 06:04:28 crc kubenswrapper[4931]: I0130 06:04:28.566543 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b"} Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.364416 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwvw5"] Jan 30 06:04:34 crc kubenswrapper[4931]: E0130 06:04:34.365653 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" containerName="collect-profiles" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.365678 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" containerName="collect-profiles" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.366016 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" containerName="collect-profiles" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.369406 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.377650 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwvw5"] Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.447396 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-catalog-content\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.447589 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-utilities\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.447641 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdxcb\" (UniqueName: \"kubernetes.io/projected/02c97a92-5bac-414e-ba28-8558cb9dbd96-kube-api-access-jdxcb\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.549472 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-utilities\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.549549 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jdxcb\" (UniqueName: \"kubernetes.io/projected/02c97a92-5bac-414e-ba28-8558cb9dbd96-kube-api-access-jdxcb\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.549813 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-catalog-content\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.550754 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-catalog-content\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.550872 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-utilities\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.586979 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdxcb\" (UniqueName: \"kubernetes.io/projected/02c97a92-5bac-414e-ba28-8558cb9dbd96-kube-api-access-jdxcb\") pod \"certified-operators-mwvw5\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:34 crc kubenswrapper[4931]: I0130 06:04:34.762122 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:35 crc kubenswrapper[4931]: I0130 06:04:35.225641 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwvw5"] Jan 30 06:04:35 crc kubenswrapper[4931]: I0130 06:04:35.629405 4931 generic.go:334] "Generic (PLEG): container finished" podID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerID="9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5" exitCode=0 Jan 30 06:04:35 crc kubenswrapper[4931]: I0130 06:04:35.629491 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwvw5" event={"ID":"02c97a92-5bac-414e-ba28-8558cb9dbd96","Type":"ContainerDied","Data":"9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5"} Jan 30 06:04:35 crc kubenswrapper[4931]: I0130 06:04:35.629796 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwvw5" event={"ID":"02c97a92-5bac-414e-ba28-8558cb9dbd96","Type":"ContainerStarted","Data":"a63c24b05c14a83e6af87250f90141d89ab441e8933bbe43b0ae52740900c283"} Jan 30 06:04:35 crc kubenswrapper[4931]: I0130 06:04:35.631740 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:04:36 crc kubenswrapper[4931]: I0130 06:04:36.649328 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwvw5" event={"ID":"02c97a92-5bac-414e-ba28-8558cb9dbd96","Type":"ContainerStarted","Data":"fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a"} Jan 30 06:04:37 crc kubenswrapper[4931]: I0130 06:04:37.661406 4931 generic.go:334] "Generic (PLEG): container finished" podID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerID="fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a" exitCode=0 Jan 30 06:04:37 crc kubenswrapper[4931]: I0130 06:04:37.661538 4931 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-mwvw5" event={"ID":"02c97a92-5bac-414e-ba28-8558cb9dbd96","Type":"ContainerDied","Data":"fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a"} Jan 30 06:04:38 crc kubenswrapper[4931]: I0130 06:04:38.675514 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwvw5" event={"ID":"02c97a92-5bac-414e-ba28-8558cb9dbd96","Type":"ContainerStarted","Data":"feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140"} Jan 30 06:04:38 crc kubenswrapper[4931]: I0130 06:04:38.709327 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwvw5" podStartSLOduration=2.291144972 podStartE2EDuration="4.709302413s" podCreationTimestamp="2026-01-30 06:04:34 +0000 UTC" firstStartedPulling="2026-01-30 06:04:35.631115279 +0000 UTC m=+3411.001025566" lastFinishedPulling="2026-01-30 06:04:38.04927274 +0000 UTC m=+3413.419183007" observedRunningTime="2026-01-30 06:04:38.706306388 +0000 UTC m=+3414.076216675" watchObservedRunningTime="2026-01-30 06:04:38.709302413 +0000 UTC m=+3414.079212710" Jan 30 06:04:44 crc kubenswrapper[4931]: I0130 06:04:44.763210 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:44 crc kubenswrapper[4931]: I0130 06:04:44.763639 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:44 crc kubenswrapper[4931]: I0130 06:04:44.834515 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:45 crc kubenswrapper[4931]: I0130 06:04:45.818155 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:45 crc kubenswrapper[4931]: I0130 06:04:45.918182 
4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwvw5"] Jan 30 06:04:47 crc kubenswrapper[4931]: I0130 06:04:47.762815 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwvw5" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="registry-server" containerID="cri-o://feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140" gracePeriod=2 Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.271569 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.451218 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-utilities\") pod \"02c97a92-5bac-414e-ba28-8558cb9dbd96\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.451275 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdxcb\" (UniqueName: \"kubernetes.io/projected/02c97a92-5bac-414e-ba28-8558cb9dbd96-kube-api-access-jdxcb\") pod \"02c97a92-5bac-414e-ba28-8558cb9dbd96\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.451402 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-catalog-content\") pod \"02c97a92-5bac-414e-ba28-8558cb9dbd96\" (UID: \"02c97a92-5bac-414e-ba28-8558cb9dbd96\") " Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.452285 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-utilities" (OuterVolumeSpecName: "utilities") pod 
"02c97a92-5bac-414e-ba28-8558cb9dbd96" (UID: "02c97a92-5bac-414e-ba28-8558cb9dbd96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.452601 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.460084 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c97a92-5bac-414e-ba28-8558cb9dbd96-kube-api-access-jdxcb" (OuterVolumeSpecName: "kube-api-access-jdxcb") pod "02c97a92-5bac-414e-ba28-8558cb9dbd96" (UID: "02c97a92-5bac-414e-ba28-8558cb9dbd96"). InnerVolumeSpecName "kube-api-access-jdxcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.496703 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02c97a92-5bac-414e-ba28-8558cb9dbd96" (UID: "02c97a92-5bac-414e-ba28-8558cb9dbd96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.554794 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdxcb\" (UniqueName: \"kubernetes.io/projected/02c97a92-5bac-414e-ba28-8558cb9dbd96-kube-api-access-jdxcb\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.554854 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c97a92-5bac-414e-ba28-8558cb9dbd96-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.775694 4931 generic.go:334] "Generic (PLEG): container finished" podID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerID="feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140" exitCode=0 Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.775757 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwvw5" event={"ID":"02c97a92-5bac-414e-ba28-8558cb9dbd96","Type":"ContainerDied","Data":"feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140"} Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.775792 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwvw5" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.775824 4931 scope.go:117] "RemoveContainer" containerID="feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.775806 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwvw5" event={"ID":"02c97a92-5bac-414e-ba28-8558cb9dbd96","Type":"ContainerDied","Data":"a63c24b05c14a83e6af87250f90141d89ab441e8933bbe43b0ae52740900c283"} Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.808684 4931 scope.go:117] "RemoveContainer" containerID="fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.830318 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwvw5"] Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.842472 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwvw5"] Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.857153 4931 scope.go:117] "RemoveContainer" containerID="9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.889768 4931 scope.go:117] "RemoveContainer" containerID="feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140" Jan 30 06:04:48 crc kubenswrapper[4931]: E0130 06:04:48.890524 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140\": container with ID starting with feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140 not found: ID does not exist" containerID="feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140" Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.890649 4931 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140"} err="failed to get container status \"feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140\": rpc error: code = NotFound desc = could not find container \"feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140\": container with ID starting with feed4d6c848880fdc6fb49613c7c3962a5ae58756d94d295ec9e57e233d4e140 not found: ID does not exist"
Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.890785 4931 scope.go:117] "RemoveContainer" containerID="fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a"
Jan 30 06:04:48 crc kubenswrapper[4931]: E0130 06:04:48.891419 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a\": container with ID starting with fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a not found: ID does not exist" containerID="fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a"
Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.891540 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a"} err="failed to get container status \"fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a\": rpc error: code = NotFound desc = could not find container \"fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a\": container with ID starting with fd726bf9b73f80d9c6177c813cc0c940b41c35ad864d34170569c019c1c3d14a not found: ID does not exist"
Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.891581 4931 scope.go:117] "RemoveContainer" containerID="9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5"
Jan 30 06:04:48 crc kubenswrapper[4931]: E0130 06:04:48.892105 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5\": container with ID starting with 9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5 not found: ID does not exist" containerID="9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5"
Jan 30 06:04:48 crc kubenswrapper[4931]: I0130 06:04:48.892148 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5"} err="failed to get container status \"9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5\": rpc error: code = NotFound desc = could not find container \"9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5\": container with ID starting with 9891d825719792c8ef2ff6a45a990cfc6e2bdb363833ded55f9e3be7ec2562e5 not found: ID does not exist"
Jan 30 06:04:49 crc kubenswrapper[4931]: I0130 06:04:49.437895 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" path="/var/lib/kubelet/pods/02c97a92-5bac-414e-ba28-8558cb9dbd96/volumes"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.003120 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-llckc"]
Jan 30 06:05:53 crc kubenswrapper[4931]: E0130 06:05:53.004519 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="extract-content"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.004549 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="extract-content"
Jan 30 06:05:53 crc kubenswrapper[4931]: E0130 06:05:53.004585 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="registry-server"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.004601 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="registry-server"
Jan 30 06:05:53 crc kubenswrapper[4931]: E0130 06:05:53.004633 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="extract-utilities"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.004649 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="extract-utilities"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.004952 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c97a92-5bac-414e-ba28-8558cb9dbd96" containerName="registry-server"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.009575 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.038802 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llckc"]
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.152991 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-utilities\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.153251 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtth6\" (UniqueName: \"kubernetes.io/projected/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-kube-api-access-rtth6\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.153325 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-catalog-content\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.255052 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtth6\" (UniqueName: \"kubernetes.io/projected/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-kube-api-access-rtth6\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.255116 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-catalog-content\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.255178 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-utilities\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.256006 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-catalog-content\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.256191 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-utilities\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.277960 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtth6\" (UniqueName: \"kubernetes.io/projected/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-kube-api-access-rtth6\") pod \"redhat-operators-llckc\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") " pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.339102 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:05:53 crc kubenswrapper[4931]: I0130 06:05:53.824296 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llckc"]
Jan 30 06:05:54 crc kubenswrapper[4931]: I0130 06:05:54.423533 4931 generic.go:334] "Generic (PLEG): container finished" podID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerID="8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a" exitCode=0
Jan 30 06:05:54 crc kubenswrapper[4931]: I0130 06:05:54.423599 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llckc" event={"ID":"c0d383f8-74f2-49c4-8586-1c0420ec4d5f","Type":"ContainerDied","Data":"8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a"}
Jan 30 06:05:54 crc kubenswrapper[4931]: I0130 06:05:54.423641 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llckc" event={"ID":"c0d383f8-74f2-49c4-8586-1c0420ec4d5f","Type":"ContainerStarted","Data":"d5d707b25abaf91ebe43fe475bc7a9765c36e250e9a1cf1650be37aaaf0e7b24"}
Jan 30 06:05:55 crc kubenswrapper[4931]: I0130 06:05:55.442506 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llckc" event={"ID":"c0d383f8-74f2-49c4-8586-1c0420ec4d5f","Type":"ContainerStarted","Data":"777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153"}
Jan 30 06:05:56 crc kubenswrapper[4931]: I0130 06:05:56.452628 4931 generic.go:334] "Generic (PLEG): container finished" podID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerID="777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153" exitCode=0
Jan 30 06:05:56 crc kubenswrapper[4931]: I0130 06:05:56.452703 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llckc" event={"ID":"c0d383f8-74f2-49c4-8586-1c0420ec4d5f","Type":"ContainerDied","Data":"777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153"}
Jan 30 06:05:57 crc kubenswrapper[4931]: I0130 06:05:57.461302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llckc" event={"ID":"c0d383f8-74f2-49c4-8586-1c0420ec4d5f","Type":"ContainerStarted","Data":"5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5"}
Jan 30 06:05:57 crc kubenswrapper[4931]: I0130 06:05:57.495053 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-llckc" podStartSLOduration=3.044566233 podStartE2EDuration="5.495025588s" podCreationTimestamp="2026-01-30 06:05:52 +0000 UTC" firstStartedPulling="2026-01-30 06:05:54.425458278 +0000 UTC m=+3489.795368545" lastFinishedPulling="2026-01-30 06:05:56.875917603 +0000 UTC m=+3492.245827900" observedRunningTime="2026-01-30 06:05:57.485762906 +0000 UTC m=+3492.855673153" watchObservedRunningTime="2026-01-30 06:05:57.495025588 +0000 UTC m=+3492.864935895"
Jan 30 06:06:03 crc kubenswrapper[4931]: I0130 06:06:03.340024 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:06:03 crc kubenswrapper[4931]: I0130 06:06:03.340648 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:06:04 crc kubenswrapper[4931]: I0130 06:06:04.415148 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llckc" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="registry-server" probeResult="failure" output=<
Jan 30 06:06:04 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s
Jan 30 06:06:04 crc kubenswrapper[4931]: >
Jan 30 06:06:13 crc kubenswrapper[4931]: I0130 06:06:13.413522 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:06:13 crc kubenswrapper[4931]: I0130 06:06:13.485917 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:06:13 crc kubenswrapper[4931]: I0130 06:06:13.665144 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llckc"]
Jan 30 06:06:14 crc kubenswrapper[4931]: I0130 06:06:14.601407 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-llckc" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="registry-server" containerID="cri-o://5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5" gracePeriod=2
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.149722 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.181657 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtth6\" (UniqueName: \"kubernetes.io/projected/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-kube-api-access-rtth6\") pod \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") "
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.181809 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-catalog-content\") pod \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") "
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.181867 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-utilities\") pod \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\" (UID: \"c0d383f8-74f2-49c4-8586-1c0420ec4d5f\") "
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.182563 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-utilities" (OuterVolumeSpecName: "utilities") pod "c0d383f8-74f2-49c4-8586-1c0420ec4d5f" (UID: "c0d383f8-74f2-49c4-8586-1c0420ec4d5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.186602 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-kube-api-access-rtth6" (OuterVolumeSpecName: "kube-api-access-rtth6") pod "c0d383f8-74f2-49c4-8586-1c0420ec4d5f" (UID: "c0d383f8-74f2-49c4-8586-1c0420ec4d5f"). InnerVolumeSpecName "kube-api-access-rtth6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.283538 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.283569 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtth6\" (UniqueName: \"kubernetes.io/projected/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-kube-api-access-rtth6\") on node \"crc\" DevicePath \"\""
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.310677 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0d383f8-74f2-49c4-8586-1c0420ec4d5f" (UID: "c0d383f8-74f2-49c4-8586-1c0420ec4d5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.385806 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d383f8-74f2-49c4-8586-1c0420ec4d5f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.613876 4931 generic.go:334] "Generic (PLEG): container finished" podID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerID="5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5" exitCode=0
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.613993 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llckc" event={"ID":"c0d383f8-74f2-49c4-8586-1c0420ec4d5f","Type":"ContainerDied","Data":"5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5"}
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.614562 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llckc" event={"ID":"c0d383f8-74f2-49c4-8586-1c0420ec4d5f","Type":"ContainerDied","Data":"d5d707b25abaf91ebe43fe475bc7a9765c36e250e9a1cf1650be37aaaf0e7b24"}
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.614594 4931 scope.go:117] "RemoveContainer" containerID="5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.613964 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llckc"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.643649 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llckc"]
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.653998 4931 scope.go:117] "RemoveContainer" containerID="777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.654698 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-llckc"]
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.684014 4931 scope.go:117] "RemoveContainer" containerID="8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.725216 4931 scope.go:117] "RemoveContainer" containerID="5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5"
Jan 30 06:06:15 crc kubenswrapper[4931]: E0130 06:06:15.725827 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5\": container with ID starting with 5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5 not found: ID does not exist" containerID="5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.725875 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5"} err="failed to get container status \"5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5\": rpc error: code = NotFound desc = could not find container \"5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5\": container with ID starting with 5506d6cbe6111ed556ba1975bcde73216453214b22d4fe8608535a9935424bd5 not found: ID does not exist"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.725910 4931 scope.go:117] "RemoveContainer" containerID="777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153"
Jan 30 06:06:15 crc kubenswrapper[4931]: E0130 06:06:15.726540 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153\": container with ID starting with 777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153 not found: ID does not exist" containerID="777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.726586 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153"} err="failed to get container status \"777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153\": rpc error: code = NotFound desc = could not find container \"777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153\": container with ID starting with 777ae7236efc1d41bc64eedf60b129a8e00c73c9050cac2e9347a487b87bd153 not found: ID does not exist"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.726674 4931 scope.go:117] "RemoveContainer" containerID="8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a"
Jan 30 06:06:15 crc kubenswrapper[4931]: E0130 06:06:15.727129 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a\": container with ID starting with 8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a not found: ID does not exist" containerID="8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a"
Jan 30 06:06:15 crc kubenswrapper[4931]: I0130 06:06:15.727163 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a"} err="failed to get container status \"8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a\": rpc error: code = NotFound desc = could not find container \"8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a\": container with ID starting with 8ad68add48f759e5bde99e4704165c1ff41188336b12af217effa58c26ea1e3a not found: ID does not exist"
Jan 30 06:06:17 crc kubenswrapper[4931]: I0130 06:06:17.438493 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" path="/var/lib/kubelet/pods/c0d383f8-74f2-49c4-8586-1c0420ec4d5f/volumes"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.069370 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j8jt4"]
Jan 30 06:06:20 crc kubenswrapper[4931]: E0130 06:06:20.069930 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="registry-server"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.069943 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="registry-server"
Jan 30 06:06:20 crc kubenswrapper[4931]: E0130 06:06:20.069954 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="extract-utilities"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.069960 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="extract-utilities"
Jan 30 06:06:20 crc kubenswrapper[4931]: E0130 06:06:20.069976 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="extract-content"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.069984 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="extract-content"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.070121 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d383f8-74f2-49c4-8586-1c0420ec4d5f" containerName="registry-server"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.071193 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.086093 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j8jt4"]
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.255411 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4ww\" (UniqueName: \"kubernetes.io/projected/da8ef012-9169-4a7f-9a5f-089f037767cb-kube-api-access-dk4ww\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.255488 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-catalog-content\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.255800 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-utilities\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.364433 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4ww\" (UniqueName: \"kubernetes.io/projected/da8ef012-9169-4a7f-9a5f-089f037767cb-kube-api-access-dk4ww\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.364501 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-catalog-content\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.364518 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-utilities\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.364920 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-utilities\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.365285 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-catalog-content\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.396695 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4ww\" (UniqueName: \"kubernetes.io/projected/da8ef012-9169-4a7f-9a5f-089f037767cb-kube-api-access-dk4ww\") pod \"community-operators-j8jt4\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") " pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.403071 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.623319 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j8jt4"]
Jan 30 06:06:20 crc kubenswrapper[4931]: I0130 06:06:20.665872 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8jt4" event={"ID":"da8ef012-9169-4a7f-9a5f-089f037767cb","Type":"ContainerStarted","Data":"ee74a72d8120c634d75689bf443743d2d71166952e679e589aec7096851cd440"}
Jan 30 06:06:21 crc kubenswrapper[4931]: I0130 06:06:21.678258 4931 generic.go:334] "Generic (PLEG): container finished" podID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerID="e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da" exitCode=0
Jan 30 06:06:21 crc kubenswrapper[4931]: I0130 06:06:21.678325 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8jt4" event={"ID":"da8ef012-9169-4a7f-9a5f-089f037767cb","Type":"ContainerDied","Data":"e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da"}
Jan 30 06:06:23 crc kubenswrapper[4931]: I0130 06:06:23.698177 4931 generic.go:334] "Generic (PLEG): container finished" podID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerID="f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d" exitCode=0
Jan 30 06:06:23 crc kubenswrapper[4931]: I0130 06:06:23.698253 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8jt4" event={"ID":"da8ef012-9169-4a7f-9a5f-089f037767cb","Type":"ContainerDied","Data":"f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d"}
Jan 30 06:06:24 crc kubenswrapper[4931]: I0130 06:06:24.711193 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8jt4" event={"ID":"da8ef012-9169-4a7f-9a5f-089f037767cb","Type":"ContainerStarted","Data":"d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f"}
Jan 30 06:06:24 crc kubenswrapper[4931]: I0130 06:06:24.746865 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j8jt4" podStartSLOduration=2.313809279 podStartE2EDuration="4.746838432s" podCreationTimestamp="2026-01-30 06:06:20 +0000 UTC" firstStartedPulling="2026-01-30 06:06:21.681437638 +0000 UTC m=+3517.051347925" lastFinishedPulling="2026-01-30 06:06:24.114466811 +0000 UTC m=+3519.484377078" observedRunningTime="2026-01-30 06:06:24.736746956 +0000 UTC m=+3520.106657243" watchObservedRunningTime="2026-01-30 06:06:24.746838432 +0000 UTC m=+3520.116748729"
Jan 30 06:06:27 crc kubenswrapper[4931]: I0130 06:06:27.362749 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:06:27 crc kubenswrapper[4931]: I0130 06:06:27.362808 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:06:30 crc kubenswrapper[4931]: I0130 06:06:30.404521 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:30 crc kubenswrapper[4931]: I0130 06:06:30.404882 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:30 crc kubenswrapper[4931]: I0130 06:06:30.463976 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:30 crc kubenswrapper[4931]: I0130 06:06:30.823478 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:30 crc kubenswrapper[4931]: I0130 06:06:30.876350 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j8jt4"]
Jan 30 06:06:32 crc kubenswrapper[4931]: I0130 06:06:32.780347 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j8jt4" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="registry-server" containerID="cri-o://d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f" gracePeriod=2
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.312979 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.432153 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-utilities\") pod \"da8ef012-9169-4a7f-9a5f-089f037767cb\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") "
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.432218 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-catalog-content\") pod \"da8ef012-9169-4a7f-9a5f-089f037767cb\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") "
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.432466 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk4ww\" (UniqueName: \"kubernetes.io/projected/da8ef012-9169-4a7f-9a5f-089f037767cb-kube-api-access-dk4ww\") pod \"da8ef012-9169-4a7f-9a5f-089f037767cb\" (UID: \"da8ef012-9169-4a7f-9a5f-089f037767cb\") "
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.434516 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-utilities" (OuterVolumeSpecName: "utilities") pod "da8ef012-9169-4a7f-9a5f-089f037767cb" (UID: "da8ef012-9169-4a7f-9a5f-089f037767cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.434761 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.441923 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8ef012-9169-4a7f-9a5f-089f037767cb-kube-api-access-dk4ww" (OuterVolumeSpecName: "kube-api-access-dk4ww") pod "da8ef012-9169-4a7f-9a5f-089f037767cb" (UID: "da8ef012-9169-4a7f-9a5f-089f037767cb"). InnerVolumeSpecName "kube-api-access-dk4ww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.492165 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da8ef012-9169-4a7f-9a5f-089f037767cb" (UID: "da8ef012-9169-4a7f-9a5f-089f037767cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.536351 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da8ef012-9169-4a7f-9a5f-089f037767cb-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.536403 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk4ww\" (UniqueName: \"kubernetes.io/projected/da8ef012-9169-4a7f-9a5f-089f037767cb-kube-api-access-dk4ww\") on node \"crc\" DevicePath \"\""
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.791710 4931 generic.go:334] "Generic (PLEG): container finished" podID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerID="d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f" exitCode=0
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.791757 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8jt4" event={"ID":"da8ef012-9169-4a7f-9a5f-089f037767cb","Type":"ContainerDied","Data":"d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f"}
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.791795 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8jt4" event={"ID":"da8ef012-9169-4a7f-9a5f-089f037767cb","Type":"ContainerDied","Data":"ee74a72d8120c634d75689bf443743d2d71166952e679e589aec7096851cd440"}
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.791817 4931 scope.go:117] "RemoveContainer" containerID="d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f"
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.791852 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j8jt4"
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.822165 4931 scope.go:117] "RemoveContainer" containerID="f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d"
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.847749 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j8jt4"]
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.855849 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j8jt4"]
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.865114 4931 scope.go:117] "RemoveContainer" containerID="e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da"
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.911082 4931 scope.go:117] "RemoveContainer" containerID="d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f"
Jan 30 06:06:33 crc kubenswrapper[4931]: E0130 06:06:33.911796 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f\": container with ID starting with d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f not found: ID does not exist" containerID="d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f"
Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.911854 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f"} err="failed to get container status \"d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f\": rpc error: code = NotFound desc = could not find container \"d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f\": container with ID starting with d5ae3d0c435be23282c47a6f4101d591c13d393322c4cb16f5a352737782a19f not
found: ID does not exist" Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.911984 4931 scope.go:117] "RemoveContainer" containerID="f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d" Jan 30 06:06:33 crc kubenswrapper[4931]: E0130 06:06:33.912409 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d\": container with ID starting with f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d not found: ID does not exist" containerID="f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d" Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.912497 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d"} err="failed to get container status \"f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d\": rpc error: code = NotFound desc = could not find container \"f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d\": container with ID starting with f6708eb42c892ea20b0cb1810bd6b68cdb5921ffd71de7b46668ec865179c12d not found: ID does not exist" Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.912528 4931 scope.go:117] "RemoveContainer" containerID="e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da" Jan 30 06:06:33 crc kubenswrapper[4931]: E0130 06:06:33.913195 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da\": container with ID starting with e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da not found: ID does not exist" containerID="e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da" Jan 30 06:06:33 crc kubenswrapper[4931]: I0130 06:06:33.913243 4931 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da"} err="failed to get container status \"e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da\": rpc error: code = NotFound desc = could not find container \"e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da\": container with ID starting with e0a99bf4b444fc11507f43c2e6824e602775ea1a814158f6d39bc9e5bf8b54da not found: ID does not exist" Jan 30 06:06:35 crc kubenswrapper[4931]: I0130 06:06:35.440282 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" path="/var/lib/kubelet/pods/da8ef012-9169-4a7f-9a5f-089f037767cb/volumes" Jan 30 06:06:57 crc kubenswrapper[4931]: I0130 06:06:57.362929 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:06:57 crc kubenswrapper[4931]: I0130 06:06:57.363663 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:07:27 crc kubenswrapper[4931]: I0130 06:07:27.362818 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:07:27 crc kubenswrapper[4931]: I0130 06:07:27.363562 4931 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:07:27 crc kubenswrapper[4931]: I0130 06:07:27.363644 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:07:27 crc kubenswrapper[4931]: I0130 06:07:27.364707 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:07:27 crc kubenswrapper[4931]: I0130 06:07:27.364814 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" gracePeriod=600 Jan 30 06:07:27 crc kubenswrapper[4931]: E0130 06:07:27.493861 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:07:28 crc kubenswrapper[4931]: I0130 06:07:28.319243 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" 
containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" exitCode=0 Jan 30 06:07:28 crc kubenswrapper[4931]: I0130 06:07:28.319308 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b"} Jan 30 06:07:28 crc kubenswrapper[4931]: I0130 06:07:28.319727 4931 scope.go:117] "RemoveContainer" containerID="3ee39d7345786d63837440f7d9f37384973d255b777a319f3ba033acd5419f51" Jan 30 06:07:28 crc kubenswrapper[4931]: I0130 06:07:28.320500 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:07:28 crc kubenswrapper[4931]: E0130 06:07:28.320893 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:07:42 crc kubenswrapper[4931]: I0130 06:07:42.422227 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:07:42 crc kubenswrapper[4931]: E0130 06:07:42.423313 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:07:56 crc kubenswrapper[4931]: I0130 
06:07:56.422116 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:07:56 crc kubenswrapper[4931]: E0130 06:07:56.423119 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:08:10 crc kubenswrapper[4931]: I0130 06:08:10.422783 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:08:10 crc kubenswrapper[4931]: E0130 06:08:10.423864 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:08:25 crc kubenswrapper[4931]: I0130 06:08:25.429408 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:08:25 crc kubenswrapper[4931]: E0130 06:08:25.430838 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:08:40 crc 
kubenswrapper[4931]: I0130 06:08:40.421711 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:08:40 crc kubenswrapper[4931]: E0130 06:08:40.423044 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:08:51 crc kubenswrapper[4931]: I0130 06:08:51.422140 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:08:51 crc kubenswrapper[4931]: E0130 06:08:51.425678 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:09:04 crc kubenswrapper[4931]: I0130 06:09:04.422254 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:09:04 crc kubenswrapper[4931]: E0130 06:09:04.423332 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 
30 06:09:17 crc kubenswrapper[4931]: I0130 06:09:17.438257 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:09:17 crc kubenswrapper[4931]: E0130 06:09:17.439242 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:09:28 crc kubenswrapper[4931]: I0130 06:09:28.422245 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:09:28 crc kubenswrapper[4931]: E0130 06:09:28.423481 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:09:43 crc kubenswrapper[4931]: I0130 06:09:43.423091 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:09:43 crc kubenswrapper[4931]: E0130 06:09:43.423851 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:09:54 crc kubenswrapper[4931]: I0130 06:09:54.422571 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:09:54 crc kubenswrapper[4931]: E0130 06:09:54.425322 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:10:09 crc kubenswrapper[4931]: I0130 06:10:09.423069 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:10:09 crc kubenswrapper[4931]: E0130 06:10:09.425050 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:10:24 crc kubenswrapper[4931]: I0130 06:10:24.422790 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:10:24 crc kubenswrapper[4931]: E0130 06:10:24.424175 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:10:35 crc kubenswrapper[4931]: I0130 06:10:35.431808 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:10:35 crc kubenswrapper[4931]: E0130 06:10:35.448058 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:10:48 crc kubenswrapper[4931]: I0130 06:10:48.423173 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:10:48 crc kubenswrapper[4931]: E0130 06:10:48.424486 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:11:02 crc kubenswrapper[4931]: I0130 06:11:02.421676 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:11:02 crc kubenswrapper[4931]: E0130 06:11:02.422787 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:11:15 crc kubenswrapper[4931]: I0130 06:11:15.429835 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:11:15 crc kubenswrapper[4931]: E0130 06:11:15.431000 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:11:29 crc kubenswrapper[4931]: I0130 06:11:29.422553 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:11:29 crc kubenswrapper[4931]: E0130 06:11:29.423947 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:11:40 crc kubenswrapper[4931]: I0130 06:11:40.422703 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:11:40 crc kubenswrapper[4931]: E0130 06:11:40.423667 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:11:54 crc kubenswrapper[4931]: I0130 06:11:54.422018 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:11:54 crc kubenswrapper[4931]: E0130 06:11:54.423128 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:12:09 crc kubenswrapper[4931]: I0130 06:12:09.423312 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:12:09 crc kubenswrapper[4931]: E0130 06:12:09.424959 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:12:24 crc kubenswrapper[4931]: I0130 06:12:24.422046 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:12:24 crc kubenswrapper[4931]: E0130 06:12:24.423111 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:12:35 crc kubenswrapper[4931]: I0130 06:12:35.431855 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:12:36 crc kubenswrapper[4931]: I0130 06:12:36.326641 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"3235141ead611a0f7e3893625153168342d0a69d19fff5669f6dde20d2fc1d8b"} Jan 30 06:13:19 crc kubenswrapper[4931]: I0130 06:13:19.908325 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6226b"] Jan 30 06:13:19 crc kubenswrapper[4931]: E0130 06:13:19.909443 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="registry-server" Jan 30 06:13:19 crc kubenswrapper[4931]: I0130 06:13:19.909464 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="registry-server" Jan 30 06:13:19 crc kubenswrapper[4931]: E0130 06:13:19.909497 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="extract-content" Jan 30 06:13:19 crc kubenswrapper[4931]: I0130 06:13:19.909511 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="extract-content" Jan 30 06:13:19 crc kubenswrapper[4931]: E0130 06:13:19.909554 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="extract-utilities" Jan 30 06:13:19 crc kubenswrapper[4931]: 
I0130 06:13:19.909567 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="extract-utilities" Jan 30 06:13:19 crc kubenswrapper[4931]: I0130 06:13:19.909804 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="da8ef012-9169-4a7f-9a5f-089f037767cb" containerName="registry-server" Jan 30 06:13:19 crc kubenswrapper[4931]: I0130 06:13:19.911709 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6226b" Jan 30 06:13:19 crc kubenswrapper[4931]: I0130 06:13:19.930308 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6226b"] Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.002640 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-utilities\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b" Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.002710 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4gh\" (UniqueName: \"kubernetes.io/projected/9820847a-122c-4574-ae00-c9fa43dbcb5c-kube-api-access-mx4gh\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b" Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.002774 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-catalog-content\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b" Jan 30 06:13:20 crc 
kubenswrapper[4931]: I0130 06:13:20.104393 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4gh\" (UniqueName: \"kubernetes.io/projected/9820847a-122c-4574-ae00-c9fa43dbcb5c-kube-api-access-mx4gh\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.104559 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-catalog-content\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.104653 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-utilities\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.105231 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-catalog-content\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.105249 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-utilities\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.132776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4gh\" (UniqueName: \"kubernetes.io/projected/9820847a-122c-4574-ae00-c9fa43dbcb5c-kube-api-access-mx4gh\") pod \"redhat-marketplace-6226b\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") " pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.248320 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:20 crc kubenswrapper[4931]: I0130 06:13:20.738782 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6226b"]
Jan 30 06:13:20 crc kubenswrapper[4931]: W0130 06:13:20.754043 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9820847a_122c_4574_ae00_c9fa43dbcb5c.slice/crio-87b14e16ffd43648ee6b21151d7310f6fa08bdb830abcee645cb7d85c984b865 WatchSource:0}: Error finding container 87b14e16ffd43648ee6b21151d7310f6fa08bdb830abcee645cb7d85c984b865: Status 404 returned error can't find the container with id 87b14e16ffd43648ee6b21151d7310f6fa08bdb830abcee645cb7d85c984b865
Jan 30 06:13:21 crc kubenswrapper[4931]: I0130 06:13:21.757518 4931 generic.go:334] "Generic (PLEG): container finished" podID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerID="9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5" exitCode=0
Jan 30 06:13:21 crc kubenswrapper[4931]: I0130 06:13:21.757684 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6226b" event={"ID":"9820847a-122c-4574-ae00-c9fa43dbcb5c","Type":"ContainerDied","Data":"9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5"}
Jan 30 06:13:21 crc kubenswrapper[4931]: I0130 06:13:21.758158 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6226b" event={"ID":"9820847a-122c-4574-ae00-c9fa43dbcb5c","Type":"ContainerStarted","Data":"87b14e16ffd43648ee6b21151d7310f6fa08bdb830abcee645cb7d85c984b865"}
Jan 30 06:13:21 crc kubenswrapper[4931]: I0130 06:13:21.762337 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 06:13:22 crc kubenswrapper[4931]: I0130 06:13:22.764767 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6226b" event={"ID":"9820847a-122c-4574-ae00-c9fa43dbcb5c","Type":"ContainerStarted","Data":"4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f"}
Jan 30 06:13:23 crc kubenswrapper[4931]: I0130 06:13:23.776770 4931 generic.go:334] "Generic (PLEG): container finished" podID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerID="4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f" exitCode=0
Jan 30 06:13:23 crc kubenswrapper[4931]: I0130 06:13:23.776937 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6226b" event={"ID":"9820847a-122c-4574-ae00-c9fa43dbcb5c","Type":"ContainerDied","Data":"4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f"}
Jan 30 06:13:24 crc kubenswrapper[4931]: I0130 06:13:24.788697 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6226b" event={"ID":"9820847a-122c-4574-ae00-c9fa43dbcb5c","Type":"ContainerStarted","Data":"12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b"}
Jan 30 06:13:24 crc kubenswrapper[4931]: I0130 06:13:24.826777 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6226b" podStartSLOduration=3.4010951289999998 podStartE2EDuration="5.826753856s" podCreationTimestamp="2026-01-30 06:13:19 +0000 UTC" firstStartedPulling="2026-01-30 06:13:21.761745868 +0000 UTC m=+3937.131656165" lastFinishedPulling="2026-01-30 06:13:24.187404595 +0000 UTC m=+3939.557314892" observedRunningTime="2026-01-30 06:13:24.817484677 +0000 UTC m=+3940.187394944" watchObservedRunningTime="2026-01-30 06:13:24.826753856 +0000 UTC m=+3940.196664133"
Jan 30 06:13:30 crc kubenswrapper[4931]: I0130 06:13:30.249367 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:30 crc kubenswrapper[4931]: I0130 06:13:30.250475 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:30 crc kubenswrapper[4931]: I0130 06:13:30.324332 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:31 crc kubenswrapper[4931]: I0130 06:13:31.264360 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:33 crc kubenswrapper[4931]: I0130 06:13:33.889201 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6226b"]
Jan 30 06:13:33 crc kubenswrapper[4931]: I0130 06:13:33.889599 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6226b" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="registry-server" containerID="cri-o://12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b" gracePeriod=2
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.506828 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.649174 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx4gh\" (UniqueName: \"kubernetes.io/projected/9820847a-122c-4574-ae00-c9fa43dbcb5c-kube-api-access-mx4gh\") pod \"9820847a-122c-4574-ae00-c9fa43dbcb5c\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") "
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.649274 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-catalog-content\") pod \"9820847a-122c-4574-ae00-c9fa43dbcb5c\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") "
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.649455 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-utilities\") pod \"9820847a-122c-4574-ae00-c9fa43dbcb5c\" (UID: \"9820847a-122c-4574-ae00-c9fa43dbcb5c\") "
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.651235 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-utilities" (OuterVolumeSpecName: "utilities") pod "9820847a-122c-4574-ae00-c9fa43dbcb5c" (UID: "9820847a-122c-4574-ae00-c9fa43dbcb5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.659467 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9820847a-122c-4574-ae00-c9fa43dbcb5c-kube-api-access-mx4gh" (OuterVolumeSpecName: "kube-api-access-mx4gh") pod "9820847a-122c-4574-ae00-c9fa43dbcb5c" (UID: "9820847a-122c-4574-ae00-c9fa43dbcb5c"). InnerVolumeSpecName "kube-api-access-mx4gh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.685337 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9820847a-122c-4574-ae00-c9fa43dbcb5c" (UID: "9820847a-122c-4574-ae00-c9fa43dbcb5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.751713 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.751766 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx4gh\" (UniqueName: \"kubernetes.io/projected/9820847a-122c-4574-ae00-c9fa43dbcb5c-kube-api-access-mx4gh\") on node \"crc\" DevicePath \"\""
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.751787 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9820847a-122c-4574-ae00-c9fa43dbcb5c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.881077 4931 generic.go:334] "Generic (PLEG): container finished" podID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerID="12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b" exitCode=0
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.881135 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6226b" event={"ID":"9820847a-122c-4574-ae00-c9fa43dbcb5c","Type":"ContainerDied","Data":"12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b"}
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.881149 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6226b"
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.881186 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6226b" event={"ID":"9820847a-122c-4574-ae00-c9fa43dbcb5c","Type":"ContainerDied","Data":"87b14e16ffd43648ee6b21151d7310f6fa08bdb830abcee645cb7d85c984b865"}
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.881222 4931 scope.go:117] "RemoveContainer" containerID="12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b"
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.922899 4931 scope.go:117] "RemoveContainer" containerID="4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f"
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.933143 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6226b"]
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.947729 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6226b"]
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.966131 4931 scope.go:117] "RemoveContainer" containerID="9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5"
Jan 30 06:13:34 crc kubenswrapper[4931]: I0130 06:13:34.999293 4931 scope.go:117] "RemoveContainer" containerID="12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b"
Jan 30 06:13:34 crc kubenswrapper[4931]: E0130 06:13:34.999823 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b\": container with ID starting with 12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b not found: ID does not exist" containerID="12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b"
Jan 30 06:13:35 crc kubenswrapper[4931]: I0130 06:13:34.999885 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b"} err="failed to get container status \"12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b\": rpc error: code = NotFound desc = could not find container \"12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b\": container with ID starting with 12d931cfeb4103bb2568906ed16443a1e9d5632d761dc2358c0e1ef823d15d3b not found: ID does not exist"
Jan 30 06:13:35 crc kubenswrapper[4931]: I0130 06:13:34.999912 4931 scope.go:117] "RemoveContainer" containerID="4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f"
Jan 30 06:13:35 crc kubenswrapper[4931]: E0130 06:13:35.000410 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f\": container with ID starting with 4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f not found: ID does not exist" containerID="4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f"
Jan 30 06:13:35 crc kubenswrapper[4931]: I0130 06:13:35.000461 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f"} err="failed to get container status \"4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f\": rpc error: code = NotFound desc = could not find container \"4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f\": container with ID starting with 4709a186bdcfef0e292891573314ae8946afbacfb45e50eef069867eaf13141f not found: ID does not exist"
Jan 30 06:13:35 crc kubenswrapper[4931]: I0130 06:13:35.000482 4931 scope.go:117] "RemoveContainer" containerID="9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5"
Jan 30 06:13:35 crc kubenswrapper[4931]: E0130 06:13:35.000895 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5\": container with ID starting with 9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5 not found: ID does not exist" containerID="9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5"
Jan 30 06:13:35 crc kubenswrapper[4931]: I0130 06:13:35.000924 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5"} err="failed to get container status \"9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5\": rpc error: code = NotFound desc = could not find container \"9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5\": container with ID starting with 9717ea1e32889e7df9d8739d7f467a863036da2019d6afd7b48486952c11a7d5 not found: ID does not exist"
Jan 30 06:13:35 crc kubenswrapper[4931]: I0130 06:13:35.437850 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" path="/var/lib/kubelet/pods/9820847a-122c-4574-ae00-c9fa43dbcb5c/volumes"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.576083 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7pttl"]
Jan 30 06:14:42 crc kubenswrapper[4931]: E0130 06:14:42.577502 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="extract-utilities"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.577536 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="extract-utilities"
Jan 30 06:14:42 crc kubenswrapper[4931]: E0130 06:14:42.577578 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="extract-content"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.577594 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="extract-content"
Jan 30 06:14:42 crc kubenswrapper[4931]: E0130 06:14:42.577650 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="registry-server"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.577668 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="registry-server"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.578012 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9820847a-122c-4574-ae00-c9fa43dbcb5c" containerName="registry-server"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.580593 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.592665 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7pttl"]
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.696999 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqlg\" (UniqueName: \"kubernetes.io/projected/fe96f298-ec26-408d-9726-27cbd48f1000-kube-api-access-fsqlg\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.697175 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-catalog-content\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.697526 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-utilities\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.799375 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-catalog-content\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.799542 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-utilities\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.799620 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqlg\" (UniqueName: \"kubernetes.io/projected/fe96f298-ec26-408d-9726-27cbd48f1000-kube-api-access-fsqlg\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.800219 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-catalog-content\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.800296 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-utilities\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.823538 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqlg\" (UniqueName: \"kubernetes.io/projected/fe96f298-ec26-408d-9726-27cbd48f1000-kube-api-access-fsqlg\") pod \"certified-operators-7pttl\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") " pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:42 crc kubenswrapper[4931]: I0130 06:14:42.919030 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:43 crc kubenswrapper[4931]: I0130 06:14:43.159117 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7pttl"]
Jan 30 06:14:43 crc kubenswrapper[4931]: I0130 06:14:43.528931 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe96f298-ec26-408d-9726-27cbd48f1000" containerID="b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688" exitCode=0
Jan 30 06:14:43 crc kubenswrapper[4931]: I0130 06:14:43.529018 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pttl" event={"ID":"fe96f298-ec26-408d-9726-27cbd48f1000","Type":"ContainerDied","Data":"b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688"}
Jan 30 06:14:43 crc kubenswrapper[4931]: I0130 06:14:43.529337 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pttl" event={"ID":"fe96f298-ec26-408d-9726-27cbd48f1000","Type":"ContainerStarted","Data":"0008547ef7be2a21ff5cf547b5d7fc2f4b3459ce1ea50374e2ee436469690c2e"}
Jan 30 06:14:44 crc kubenswrapper[4931]: I0130 06:14:44.544994 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pttl" event={"ID":"fe96f298-ec26-408d-9726-27cbd48f1000","Type":"ContainerStarted","Data":"20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9"}
Jan 30 06:14:45 crc kubenswrapper[4931]: I0130 06:14:45.556249 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe96f298-ec26-408d-9726-27cbd48f1000" containerID="20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9" exitCode=0
Jan 30 06:14:45 crc kubenswrapper[4931]: I0130 06:14:45.556306 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pttl" event={"ID":"fe96f298-ec26-408d-9726-27cbd48f1000","Type":"ContainerDied","Data":"20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9"}
Jan 30 06:14:46 crc kubenswrapper[4931]: I0130 06:14:46.569972 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pttl" event={"ID":"fe96f298-ec26-408d-9726-27cbd48f1000","Type":"ContainerStarted","Data":"681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca"}
Jan 30 06:14:46 crc kubenswrapper[4931]: I0130 06:14:46.600468 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7pttl" podStartSLOduration=2.1615184100000002 podStartE2EDuration="4.600402478s" podCreationTimestamp="2026-01-30 06:14:42 +0000 UTC" firstStartedPulling="2026-01-30 06:14:43.530572245 +0000 UTC m=+4018.900482532" lastFinishedPulling="2026-01-30 06:14:45.969456303 +0000 UTC m=+4021.339366600" observedRunningTime="2026-01-30 06:14:46.596993093 +0000 UTC m=+4021.966903410" watchObservedRunningTime="2026-01-30 06:14:46.600402478 +0000 UTC m=+4021.970312785"
Jan 30 06:14:52 crc kubenswrapper[4931]: I0130 06:14:52.919998 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:52 crc kubenswrapper[4931]: I0130 06:14:52.921004 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:53 crc kubenswrapper[4931]: I0130 06:14:53.001361 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:53 crc kubenswrapper[4931]: I0130 06:14:53.723988 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:53 crc kubenswrapper[4931]: I0130 06:14:53.791453 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7pttl"]
Jan 30 06:14:55 crc kubenswrapper[4931]: I0130 06:14:55.650288 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7pttl" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="registry-server" containerID="cri-o://681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca" gracePeriod=2
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.177265 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.343069 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsqlg\" (UniqueName: \"kubernetes.io/projected/fe96f298-ec26-408d-9726-27cbd48f1000-kube-api-access-fsqlg\") pod \"fe96f298-ec26-408d-9726-27cbd48f1000\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") "
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.343225 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-utilities\") pod \"fe96f298-ec26-408d-9726-27cbd48f1000\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") "
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.343269 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-catalog-content\") pod \"fe96f298-ec26-408d-9726-27cbd48f1000\" (UID: \"fe96f298-ec26-408d-9726-27cbd48f1000\") "
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.344697 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-utilities" (OuterVolumeSpecName: "utilities") pod "fe96f298-ec26-408d-9726-27cbd48f1000" (UID: "fe96f298-ec26-408d-9726-27cbd48f1000"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.349503 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe96f298-ec26-408d-9726-27cbd48f1000-kube-api-access-fsqlg" (OuterVolumeSpecName: "kube-api-access-fsqlg") pod "fe96f298-ec26-408d-9726-27cbd48f1000" (UID: "fe96f298-ec26-408d-9726-27cbd48f1000"). InnerVolumeSpecName "kube-api-access-fsqlg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.404532 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe96f298-ec26-408d-9726-27cbd48f1000" (UID: "fe96f298-ec26-408d-9726-27cbd48f1000"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.444973 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsqlg\" (UniqueName: \"kubernetes.io/projected/fe96f298-ec26-408d-9726-27cbd48f1000-kube-api-access-fsqlg\") on node \"crc\" DevicePath \"\""
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.445022 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.445041 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe96f298-ec26-408d-9726-27cbd48f1000-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.667405 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe96f298-ec26-408d-9726-27cbd48f1000" containerID="681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca" exitCode=0
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.667522 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pttl" event={"ID":"fe96f298-ec26-408d-9726-27cbd48f1000","Type":"ContainerDied","Data":"681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca"}
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.667573 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pttl" event={"ID":"fe96f298-ec26-408d-9726-27cbd48f1000","Type":"ContainerDied","Data":"0008547ef7be2a21ff5cf547b5d7fc2f4b3459ce1ea50374e2ee436469690c2e"}
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.667578 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pttl"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.667612 4931 scope.go:117] "RemoveContainer" containerID="681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.699232 4931 scope.go:117] "RemoveContainer" containerID="20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.724116 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7pttl"]
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.742570 4931 scope.go:117] "RemoveContainer" containerID="b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.745179 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7pttl"]
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.787482 4931 scope.go:117] "RemoveContainer" containerID="681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca"
Jan 30 06:14:56 crc kubenswrapper[4931]: E0130 06:14:56.788052 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca\": container with ID starting with 681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca not found: ID does not exist" containerID="681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.788128 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca"} err="failed to get container status \"681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca\": rpc error: code = NotFound desc = could not find container \"681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca\": container with ID starting with 681697db3d7f21be1eb126ee925add4f086d68ee090000e0e969acd0abaa7fca not found: ID does not exist"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.788175 4931 scope.go:117] "RemoveContainer" containerID="20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9"
Jan 30 06:14:56 crc kubenswrapper[4931]: E0130 06:14:56.788657 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9\": container with ID starting with 20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9 not found: ID does not exist" containerID="20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.788721 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9"} err="failed to get container status \"20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9\": rpc error: code = NotFound desc = could not find container \"20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9\": container with ID starting with 20fa62f0325e13f9a0a0df805e269ffeddb7f5f6ec884c6688eb9af3d56750c9 not found: ID does not exist"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.788750 4931 scope.go:117] "RemoveContainer" containerID="b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688"
Jan 30 06:14:56 crc kubenswrapper[4931]: E0130 06:14:56.789076 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688\": container with ID starting with b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688 not found: ID does not exist" containerID="b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688"
Jan 30 06:14:56 crc kubenswrapper[4931]: I0130 06:14:56.789125 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688"} err="failed to get container status \"b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688\": rpc error: code = NotFound desc = could not find container \"b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688\": container with ID starting with b0c0c7ee21138cd6602968a5f94a11879dfd4c2478dbe3f4674a5869822eb688 not found: ID does not exist"
Jan 30 06:14:57 crc kubenswrapper[4931]: I0130 06:14:57.363862 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:14:57 crc kubenswrapper[4931]: I0130 06:14:57.364461 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:14:57 crc kubenswrapper[4931]: I0130 06:14:57.439829 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" path="/var/lib/kubelet/pods/fe96f298-ec26-408d-9726-27cbd48f1000/volumes"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.237883 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"]
Jan 30 06:15:00 crc kubenswrapper[4931]: E0130 06:15:00.238249 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="extract-content"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.238265 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="extract-content"
Jan 30 06:15:00 crc kubenswrapper[4931]: E0130 06:15:00.238283 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="registry-server"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.238291 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="registry-server"
Jan 30 06:15:00 crc kubenswrapper[4931]: E0130 06:15:00.238301 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="extract-utilities"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.238309 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="extract-utilities"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.238503 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe96f298-ec26-408d-9726-27cbd48f1000" containerName="registry-server"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.239032 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.244743 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.247376 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.269374 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"]
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.324242 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-config-volume\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.324350 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-secret-volume\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"
Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.324616 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdt9v\" (UniqueName: \"kubernetes.io/projected/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-kube-api-access-mdt9v\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.425716 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-secret-volume\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.425900 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdt9v\" (UniqueName: \"kubernetes.io/projected/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-kube-api-access-mdt9v\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.426026 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-config-volume\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.427602 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-config-volume\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.440731 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-secret-volume\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.447775 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdt9v\" (UniqueName: \"kubernetes.io/projected/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-kube-api-access-mdt9v\") pod \"collect-profiles-29495895-64nq8\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.562827 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:00 crc kubenswrapper[4931]: I0130 06:15:00.854713 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"] Jan 30 06:15:01 crc kubenswrapper[4931]: I0130 06:15:01.713525 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" event={"ID":"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e","Type":"ContainerStarted","Data":"d9b4e7ab55cdff59c11d247a35f15b900c5c2d23ac2be2cf5caa19378305d01e"} Jan 30 06:15:01 crc kubenswrapper[4931]: I0130 06:15:01.715092 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" event={"ID":"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e","Type":"ContainerStarted","Data":"c74a6b786346dd0ffbe00b1e070896f96528ce8f3c40277f6789ceba0bd660e2"} Jan 30 06:15:01 crc kubenswrapper[4931]: I0130 06:15:01.736568 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" 
podStartSLOduration=1.736544033 podStartE2EDuration="1.736544033s" podCreationTimestamp="2026-01-30 06:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:15:01.735881105 +0000 UTC m=+4037.105791372" watchObservedRunningTime="2026-01-30 06:15:01.736544033 +0000 UTC m=+4037.106454310" Jan 30 06:15:02 crc kubenswrapper[4931]: I0130 06:15:02.725890 4931 generic.go:334] "Generic (PLEG): container finished" podID="f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" containerID="d9b4e7ab55cdff59c11d247a35f15b900c5c2d23ac2be2cf5caa19378305d01e" exitCode=0 Jan 30 06:15:02 crc kubenswrapper[4931]: I0130 06:15:02.725966 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" event={"ID":"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e","Type":"ContainerDied","Data":"d9b4e7ab55cdff59c11d247a35f15b900c5c2d23ac2be2cf5caa19378305d01e"} Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.178190 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.284455 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-config-volume\") pod \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.284527 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdt9v\" (UniqueName: \"kubernetes.io/projected/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-kube-api-access-mdt9v\") pod \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.284659 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-secret-volume\") pod \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\" (UID: \"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e\") " Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.286170 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" (UID: "f7d1c4a0-d36c-47d4-b603-3320c87f7c8e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.302343 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-kube-api-access-mdt9v" (OuterVolumeSpecName: "kube-api-access-mdt9v") pod "f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" (UID: "f7d1c4a0-d36c-47d4-b603-3320c87f7c8e"). 
InnerVolumeSpecName "kube-api-access-mdt9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.303148 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" (UID: "f7d1c4a0-d36c-47d4-b603-3320c87f7c8e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.386755 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.386807 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdt9v\" (UniqueName: \"kubernetes.io/projected/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-kube-api-access-mdt9v\") on node \"crc\" DevicePath \"\"" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.386830 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.747286 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" event={"ID":"f7d1c4a0-d36c-47d4-b603-3320c87f7c8e","Type":"ContainerDied","Data":"c74a6b786346dd0ffbe00b1e070896f96528ce8f3c40277f6789ceba0bd660e2"} Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.747834 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74a6b786346dd0ffbe00b1e070896f96528ce8f3c40277f6789ceba0bd660e2" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.747405 4931 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8" Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.848297 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv"] Jan 30 06:15:04 crc kubenswrapper[4931]: I0130 06:15:04.858277 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-8t6xv"] Jan 30 06:15:05 crc kubenswrapper[4931]: I0130 06:15:05.438788 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a19500-eb44-455f-a8b7-7ee5375b87ef" path="/var/lib/kubelet/pods/09a19500-eb44-455f-a8b7-7ee5375b87ef/volumes" Jan 30 06:15:27 crc kubenswrapper[4931]: I0130 06:15:27.363735 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:15:27 crc kubenswrapper[4931]: I0130 06:15:27.364620 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:15:53 crc kubenswrapper[4931]: I0130 06:15:53.154250 4931 scope.go:117] "RemoveContainer" containerID="f1cc6685442d84c78caf7ee74e69ba6f0a12fa18a641f9f2d8eb2d03f2ae6e04" Jan 30 06:15:57 crc kubenswrapper[4931]: I0130 06:15:57.364320 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:15:57 crc kubenswrapper[4931]: I0130 06:15:57.365115 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:15:57 crc kubenswrapper[4931]: I0130 06:15:57.365189 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:15:57 crc kubenswrapper[4931]: I0130 06:15:57.366226 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3235141ead611a0f7e3893625153168342d0a69d19fff5669f6dde20d2fc1d8b"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:15:57 crc kubenswrapper[4931]: I0130 06:15:57.366338 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://3235141ead611a0f7e3893625153168342d0a69d19fff5669f6dde20d2fc1d8b" gracePeriod=600 Jan 30 06:15:58 crc kubenswrapper[4931]: I0130 06:15:58.294643 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="3235141ead611a0f7e3893625153168342d0a69d19fff5669f6dde20d2fc1d8b" exitCode=0 Jan 30 06:15:58 crc kubenswrapper[4931]: I0130 06:15:58.294747 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"3235141ead611a0f7e3893625153168342d0a69d19fff5669f6dde20d2fc1d8b"} Jan 30 06:15:58 crc kubenswrapper[4931]: I0130 06:15:58.295343 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229"} Jan 30 06:15:58 crc kubenswrapper[4931]: I0130 06:15:58.295381 4931 scope.go:117] "RemoveContainer" containerID="761577d14354de60e9f89a79784490438e71d9747d27a00c144870c39871d54b" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.098730 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lw8bs"] Jan 30 06:16:45 crc kubenswrapper[4931]: E0130 06:16:45.103033 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" containerName="collect-profiles" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.103626 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" containerName="collect-profiles" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.104329 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" containerName="collect-profiles" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.110540 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.122569 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lw8bs"] Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.280085 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9591e541-c3a7-4565-a829-b3da700f84ff-utilities\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.280579 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4frp8\" (UniqueName: \"kubernetes.io/projected/9591e541-c3a7-4565-a829-b3da700f84ff-kube-api-access-4frp8\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.280806 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9591e541-c3a7-4565-a829-b3da700f84ff-catalog-content\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.381622 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9591e541-c3a7-4565-a829-b3da700f84ff-catalog-content\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.381711 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9591e541-c3a7-4565-a829-b3da700f84ff-utilities\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.381840 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4frp8\" (UniqueName: \"kubernetes.io/projected/9591e541-c3a7-4565-a829-b3da700f84ff-kube-api-access-4frp8\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.382904 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9591e541-c3a7-4565-a829-b3da700f84ff-catalog-content\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.382974 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9591e541-c3a7-4565-a829-b3da700f84ff-utilities\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.401939 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4frp8\" (UniqueName: \"kubernetes.io/projected/9591e541-c3a7-4565-a829-b3da700f84ff-kube-api-access-4frp8\") pod \"community-operators-lw8bs\" (UID: \"9591e541-c3a7-4565-a829-b3da700f84ff\") " pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.437852 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:45 crc kubenswrapper[4931]: I0130 06:16:45.935881 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lw8bs"] Jan 30 06:16:46 crc kubenswrapper[4931]: I0130 06:16:46.740331 4931 generic.go:334] "Generic (PLEG): container finished" podID="9591e541-c3a7-4565-a829-b3da700f84ff" containerID="98a178c8781f2c5745176d2c050503a39574e2f26cb3321c038637ccc4b2d914" exitCode=0 Jan 30 06:16:46 crc kubenswrapper[4931]: I0130 06:16:46.740407 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw8bs" event={"ID":"9591e541-c3a7-4565-a829-b3da700f84ff","Type":"ContainerDied","Data":"98a178c8781f2c5745176d2c050503a39574e2f26cb3321c038637ccc4b2d914"} Jan 30 06:16:46 crc kubenswrapper[4931]: I0130 06:16:46.741560 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw8bs" event={"ID":"9591e541-c3a7-4565-a829-b3da700f84ff","Type":"ContainerStarted","Data":"3621eafb4fe816d5783cf2a52ae599f2c303610ec84bd79ebd10aa3a4f24be32"} Jan 30 06:16:50 crc kubenswrapper[4931]: I0130 06:16:50.798849 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw8bs" event={"ID":"9591e541-c3a7-4565-a829-b3da700f84ff","Type":"ContainerStarted","Data":"b0d974a3f53b6a76e998aaf425b1eeee40fef7215c292540c1180f9ff99d9b4b"} Jan 30 06:16:51 crc kubenswrapper[4931]: I0130 06:16:51.812269 4931 generic.go:334] "Generic (PLEG): container finished" podID="9591e541-c3a7-4565-a829-b3da700f84ff" containerID="b0d974a3f53b6a76e998aaf425b1eeee40fef7215c292540c1180f9ff99d9b4b" exitCode=0 Jan 30 06:16:51 crc kubenswrapper[4931]: I0130 06:16:51.812377 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw8bs" 
event={"ID":"9591e541-c3a7-4565-a829-b3da700f84ff","Type":"ContainerDied","Data":"b0d974a3f53b6a76e998aaf425b1eeee40fef7215c292540c1180f9ff99d9b4b"} Jan 30 06:16:52 crc kubenswrapper[4931]: I0130 06:16:52.824086 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw8bs" event={"ID":"9591e541-c3a7-4565-a829-b3da700f84ff","Type":"ContainerStarted","Data":"778afc5ebb1e169282450845aacae8ec9fe089e1af268dbbc12463e6f9e10e7e"} Jan 30 06:16:52 crc kubenswrapper[4931]: I0130 06:16:52.858554 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lw8bs" podStartSLOduration=2.300149483 podStartE2EDuration="7.858528437s" podCreationTimestamp="2026-01-30 06:16:45 +0000 UTC" firstStartedPulling="2026-01-30 06:16:46.742177982 +0000 UTC m=+4142.112088279" lastFinishedPulling="2026-01-30 06:16:52.300556936 +0000 UTC m=+4147.670467233" observedRunningTime="2026-01-30 06:16:52.852690995 +0000 UTC m=+4148.222601262" watchObservedRunningTime="2026-01-30 06:16:52.858528437 +0000 UTC m=+4148.228438734" Jan 30 06:16:55 crc kubenswrapper[4931]: I0130 06:16:55.438820 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:55 crc kubenswrapper[4931]: I0130 06:16:55.439662 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:16:55 crc kubenswrapper[4931]: I0130 06:16:55.519624 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:17:05 crc kubenswrapper[4931]: I0130 06:17:05.509317 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lw8bs" Jan 30 06:17:05 crc kubenswrapper[4931]: I0130 06:17:05.613578 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-lw8bs"] Jan 30 06:17:05 crc kubenswrapper[4931]: I0130 06:17:05.678160 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kf2zk"] Jan 30 06:17:05 crc kubenswrapper[4931]: I0130 06:17:05.678558 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kf2zk" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="registry-server" containerID="cri-o://f2435e1bbad2c342892feef9048c365b9106973a68f19a101621695990d1928d" gracePeriod=2 Jan 30 06:17:05 crc kubenswrapper[4931]: I0130 06:17:05.926440 4931 generic.go:334] "Generic (PLEG): container finished" podID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerID="f2435e1bbad2c342892feef9048c365b9106973a68f19a101621695990d1928d" exitCode=0 Jan 30 06:17:05 crc kubenswrapper[4931]: I0130 06:17:05.926520 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kf2zk" event={"ID":"4e7fc26b-b0a0-4ed3-973a-d14f3118f495","Type":"ContainerDied","Data":"f2435e1bbad2c342892feef9048c365b9106973a68f19a101621695990d1928d"} Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.066177 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kf2zk" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.266251 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-utilities\") pod \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.266364 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kl88\" (UniqueName: \"kubernetes.io/projected/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-kube-api-access-8kl88\") pod \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.266467 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-catalog-content\") pod \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\" (UID: \"4e7fc26b-b0a0-4ed3-973a-d14f3118f495\") " Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.274104 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-utilities" (OuterVolumeSpecName: "utilities") pod "4e7fc26b-b0a0-4ed3-973a-d14f3118f495" (UID: "4e7fc26b-b0a0-4ed3-973a-d14f3118f495"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.279617 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-kube-api-access-8kl88" (OuterVolumeSpecName: "kube-api-access-8kl88") pod "4e7fc26b-b0a0-4ed3-973a-d14f3118f495" (UID: "4e7fc26b-b0a0-4ed3-973a-d14f3118f495"). InnerVolumeSpecName "kube-api-access-8kl88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.317230 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e7fc26b-b0a0-4ed3-973a-d14f3118f495" (UID: "4e7fc26b-b0a0-4ed3-973a-d14f3118f495"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.368379 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kl88\" (UniqueName: \"kubernetes.io/projected/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-kube-api-access-8kl88\") on node \"crc\" DevicePath \"\"" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.368433 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.368442 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e7fc26b-b0a0-4ed3-973a-d14f3118f495-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.942273 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kf2zk" event={"ID":"4e7fc26b-b0a0-4ed3-973a-d14f3118f495","Type":"ContainerDied","Data":"1886542e78e49a17a6d2a06541b8bb125f47e58db49595a42b19bb6b2b8126f7"} Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.942361 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kf2zk" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.942767 4931 scope.go:117] "RemoveContainer" containerID="f2435e1bbad2c342892feef9048c365b9106973a68f19a101621695990d1928d" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.967077 4931 scope.go:117] "RemoveContainer" containerID="3075fb652ae28ffe627ee2fb3e561168de8141b7ab9f92e2f6fdc3f70ab564db" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.981306 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kf2zk"] Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.994523 4931 scope.go:117] "RemoveContainer" containerID="354aad0cad4b5a2844a0aaa97a5d9c4e75d0d2f7996caccea5b63021c15588c0" Jan 30 06:17:06 crc kubenswrapper[4931]: I0130 06:17:06.995589 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kf2zk"] Jan 30 06:17:07 crc kubenswrapper[4931]: I0130 06:17:07.431339 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" path="/var/lib/kubelet/pods/4e7fc26b-b0a0-4ed3-973a-d14f3118f495/volumes" Jan 30 06:17:57 crc kubenswrapper[4931]: I0130 06:17:57.363637 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:17:57 crc kubenswrapper[4931]: I0130 06:17:57.364778 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:18:27 crc kubenswrapper[4931]: 
I0130 06:18:27.363589 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:18:27 crc kubenswrapper[4931]: I0130 06:18:27.364179 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:18:57 crc kubenswrapper[4931]: I0130 06:18:57.363722 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:18:57 crc kubenswrapper[4931]: I0130 06:18:57.364348 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:18:57 crc kubenswrapper[4931]: I0130 06:18:57.364417 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:18:57 crc kubenswrapper[4931]: I0130 06:18:57.365236 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229"} 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:18:57 crc kubenswrapper[4931]: I0130 06:18:57.365335 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" gracePeriod=600 Jan 30 06:18:57 crc kubenswrapper[4931]: E0130 06:18:57.507996 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:18:58 crc kubenswrapper[4931]: I0130 06:18:58.037939 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" exitCode=0 Jan 30 06:18:58 crc kubenswrapper[4931]: I0130 06:18:58.037968 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229"} Jan 30 06:18:58 crc kubenswrapper[4931]: I0130 06:18:58.038637 4931 scope.go:117] "RemoveContainer" containerID="3235141ead611a0f7e3893625153168342d0a69d19fff5669f6dde20d2fc1d8b" Jan 30 06:18:58 crc kubenswrapper[4931]: I0130 06:18:58.039339 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 
30 06:18:58 crc kubenswrapper[4931]: E0130 06:18:58.039778 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:19:09 crc kubenswrapper[4931]: I0130 06:19:09.422455 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:19:09 crc kubenswrapper[4931]: E0130 06:19:09.423468 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:19:21 crc kubenswrapper[4931]: I0130 06:19:21.422680 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:19:21 crc kubenswrapper[4931]: E0130 06:19:21.423492 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.484242 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pvwtk"] 
Jan 30 06:19:30 crc kubenswrapper[4931]: E0130 06:19:30.484995 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="registry-server" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.485007 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="registry-server" Jan 30 06:19:30 crc kubenswrapper[4931]: E0130 06:19:30.485025 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="extract-content" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.485031 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="extract-content" Jan 30 06:19:30 crc kubenswrapper[4931]: E0130 06:19:30.485047 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="extract-utilities" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.485053 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="extract-utilities" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.485179 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e7fc26b-b0a0-4ed3-973a-d14f3118f495" containerName="registry-server" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.486020 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.499206 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvwtk"] Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.576244 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-utilities\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.576302 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfb57\" (UniqueName: \"kubernetes.io/projected/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-kube-api-access-gfb57\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.576339 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-catalog-content\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.677606 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-utilities\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.677679 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gfb57\" (UniqueName: \"kubernetes.io/projected/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-kube-api-access-gfb57\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.677762 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-catalog-content\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.678551 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-catalog-content\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.678572 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-utilities\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.696853 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfb57\" (UniqueName: \"kubernetes.io/projected/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-kube-api-access-gfb57\") pod \"redhat-operators-pvwtk\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:30 crc kubenswrapper[4931]: I0130 06:19:30.804979 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:31 crc kubenswrapper[4931]: I0130 06:19:31.310637 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvwtk"] Jan 30 06:19:31 crc kubenswrapper[4931]: I0130 06:19:31.345015 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvwtk" event={"ID":"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4","Type":"ContainerStarted","Data":"f397bc428f5dd22c0df0143328bdde718f24dac9ab63094f67925c4d228021cd"} Jan 30 06:19:32 crc kubenswrapper[4931]: I0130 06:19:32.352789 4931 generic.go:334] "Generic (PLEG): container finished" podID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerID="b6286ad94de30681c2a835c7e301c6d1df98a96018bfe2306f9440b968c77016" exitCode=0 Jan 30 06:19:32 crc kubenswrapper[4931]: I0130 06:19:32.352883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvwtk" event={"ID":"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4","Type":"ContainerDied","Data":"b6286ad94de30681c2a835c7e301c6d1df98a96018bfe2306f9440b968c77016"} Jan 30 06:19:32 crc kubenswrapper[4931]: I0130 06:19:32.356170 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:19:33 crc kubenswrapper[4931]: I0130 06:19:33.366696 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvwtk" event={"ID":"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4","Type":"ContainerStarted","Data":"f8f5eb6693ba1ad57ad1b7da16804037a17ede8e8520ed704cf51910ecbe0943"} Jan 30 06:19:34 crc kubenswrapper[4931]: I0130 06:19:34.378501 4931 generic.go:334] "Generic (PLEG): container finished" podID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerID="f8f5eb6693ba1ad57ad1b7da16804037a17ede8e8520ed704cf51910ecbe0943" exitCode=0 Jan 30 06:19:34 crc kubenswrapper[4931]: I0130 06:19:34.378555 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pvwtk" event={"ID":"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4","Type":"ContainerDied","Data":"f8f5eb6693ba1ad57ad1b7da16804037a17ede8e8520ed704cf51910ecbe0943"} Jan 30 06:19:35 crc kubenswrapper[4931]: I0130 06:19:35.410468 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvwtk" event={"ID":"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4","Type":"ContainerStarted","Data":"8a43ab2a73d1194d386a7a60f91b57edea527c87bc38e74d204537d7f9f1a4b8"} Jan 30 06:19:35 crc kubenswrapper[4931]: I0130 06:19:35.441927 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pvwtk" podStartSLOduration=3.022135117 podStartE2EDuration="5.44190379s" podCreationTimestamp="2026-01-30 06:19:30 +0000 UTC" firstStartedPulling="2026-01-30 06:19:32.355949548 +0000 UTC m=+4307.725859805" lastFinishedPulling="2026-01-30 06:19:34.775718181 +0000 UTC m=+4310.145628478" observedRunningTime="2026-01-30 06:19:35.441314024 +0000 UTC m=+4310.811224321" watchObservedRunningTime="2026-01-30 06:19:35.44190379 +0000 UTC m=+4310.811814087" Jan 30 06:19:36 crc kubenswrapper[4931]: I0130 06:19:36.421760 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:19:36 crc kubenswrapper[4931]: E0130 06:19:36.422054 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:19:40 crc kubenswrapper[4931]: I0130 06:19:40.805509 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:40 crc kubenswrapper[4931]: I0130 06:19:40.806236 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:41 crc kubenswrapper[4931]: I0130 06:19:41.877468 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pvwtk" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="registry-server" probeResult="failure" output=< Jan 30 06:19:41 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 06:19:41 crc kubenswrapper[4931]: > Jan 30 06:19:50 crc kubenswrapper[4931]: I0130 06:19:50.882728 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:50 crc kubenswrapper[4931]: I0130 06:19:50.962794 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:51 crc kubenswrapper[4931]: I0130 06:19:51.139591 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvwtk"] Jan 30 06:19:51 crc kubenswrapper[4931]: I0130 06:19:51.422110 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:19:51 crc kubenswrapper[4931]: E0130 06:19:51.422485 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:19:52 crc kubenswrapper[4931]: I0130 06:19:52.608077 4931 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-pvwtk" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="registry-server" containerID="cri-o://8a43ab2a73d1194d386a7a60f91b57edea527c87bc38e74d204537d7f9f1a4b8" gracePeriod=2 Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.643020 4931 generic.go:334] "Generic (PLEG): container finished" podID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerID="8a43ab2a73d1194d386a7a60f91b57edea527c87bc38e74d204537d7f9f1a4b8" exitCode=0 Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.643114 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvwtk" event={"ID":"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4","Type":"ContainerDied","Data":"8a43ab2a73d1194d386a7a60f91b57edea527c87bc38e74d204537d7f9f1a4b8"} Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.877055 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.978720 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-catalog-content\") pod \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.978834 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfb57\" (UniqueName: \"kubernetes.io/projected/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-kube-api-access-gfb57\") pod \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.978911 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-utilities\") pod 
\"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\" (UID: \"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4\") " Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.979887 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-utilities" (OuterVolumeSpecName: "utilities") pod "14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" (UID: "14543fe1-7a55-41f4-ab2d-fa5727bcf0c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:53 crc kubenswrapper[4931]: I0130 06:19:53.986635 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-kube-api-access-gfb57" (OuterVolumeSpecName: "kube-api-access-gfb57") pod "14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" (UID: "14543fe1-7a55-41f4-ab2d-fa5727bcf0c4"). InnerVolumeSpecName "kube-api-access-gfb57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.081305 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfb57\" (UniqueName: \"kubernetes.io/projected/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-kube-api-access-gfb57\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.081364 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.169109 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" (UID: "14543fe1-7a55-41f4-ab2d-fa5727bcf0c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.183381 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.657361 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvwtk" event={"ID":"14543fe1-7a55-41f4-ab2d-fa5727bcf0c4","Type":"ContainerDied","Data":"f397bc428f5dd22c0df0143328bdde718f24dac9ab63094f67925c4d228021cd"} Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.657882 4931 scope.go:117] "RemoveContainer" containerID="8a43ab2a73d1194d386a7a60f91b57edea527c87bc38e74d204537d7f9f1a4b8" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.657527 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvwtk" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.705495 4931 scope.go:117] "RemoveContainer" containerID="f8f5eb6693ba1ad57ad1b7da16804037a17ede8e8520ed704cf51910ecbe0943" Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.726742 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvwtk"] Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.738689 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pvwtk"] Jan 30 06:19:54 crc kubenswrapper[4931]: I0130 06:19:54.745548 4931 scope.go:117] "RemoveContainer" containerID="b6286ad94de30681c2a835c7e301c6d1df98a96018bfe2306f9440b968c77016" Jan 30 06:19:55 crc kubenswrapper[4931]: I0130 06:19:55.442664 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" path="/var/lib/kubelet/pods/14543fe1-7a55-41f4-ab2d-fa5727bcf0c4/volumes" Jan 30 06:20:02 crc 
kubenswrapper[4931]: I0130 06:20:02.422164 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:20:02 crc kubenswrapper[4931]: E0130 06:20:02.423105 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.178241 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-mlqzd"] Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.188379 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-mlqzd"] Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.327953 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-h9twk"] Jan 30 06:20:04 crc kubenswrapper[4931]: E0130 06:20:04.328513 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="registry-server" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.328543 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="registry-server" Jan 30 06:20:04 crc kubenswrapper[4931]: E0130 06:20:04.328589 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="extract-utilities" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.328602 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="extract-utilities" Jan 30 06:20:04 crc kubenswrapper[4931]: E0130 06:20:04.328630 4931 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="extract-content" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.328644 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="extract-content" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.328892 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="14543fe1-7a55-41f4-ab2d-fa5727bcf0c4" containerName="registry-server" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.329651 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.332813 4931 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-ff66z" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.333364 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.333556 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.334163 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.344488 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h9twk"] Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.395739 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-crc-storage\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.395826 
4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rxzf\" (UniqueName: \"kubernetes.io/projected/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-kube-api-access-2rxzf\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.396125 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-node-mnt\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.498216 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-crc-storage\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.498311 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rxzf\" (UniqueName: \"kubernetes.io/projected/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-kube-api-access-2rxzf\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.498520 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-node-mnt\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.498962 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-node-mnt\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.499496 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-crc-storage\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.520641 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rxzf\" (UniqueName: \"kubernetes.io/projected/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-kube-api-access-2rxzf\") pod \"crc-storage-crc-h9twk\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:04 crc kubenswrapper[4931]: I0130 06:20:04.689230 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:05 crc kubenswrapper[4931]: I0130 06:20:05.015274 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h9twk"] Jan 30 06:20:05 crc kubenswrapper[4931]: I0130 06:20:05.439418 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f395498-8955-4aa5-b283-62e5b12505f1" path="/var/lib/kubelet/pods/7f395498-8955-4aa5-b283-62e5b12505f1/volumes" Jan 30 06:20:05 crc kubenswrapper[4931]: I0130 06:20:05.761920 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h9twk" event={"ID":"edadf0b9-7f51-445f-8dd1-53dc9fae53aa","Type":"ContainerStarted","Data":"720bcba1a2545c662bf1fa3d80619562b3bc89fa22675dd3d667df179fa2299b"} Jan 30 06:20:06 crc kubenswrapper[4931]: I0130 06:20:06.775262 4931 generic.go:334] "Generic (PLEG): container finished" podID="edadf0b9-7f51-445f-8dd1-53dc9fae53aa" containerID="7d286e4ff9e3a5d29e83c4a7e4320e5360dd3ec6c72cd95a6b0fdf400bac7103" exitCode=0 Jan 30 06:20:06 crc kubenswrapper[4931]: I0130 06:20:06.775639 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h9twk" event={"ID":"edadf0b9-7f51-445f-8dd1-53dc9fae53aa","Type":"ContainerDied","Data":"7d286e4ff9e3a5d29e83c4a7e4320e5360dd3ec6c72cd95a6b0fdf400bac7103"} Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.151039 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.270387 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rxzf\" (UniqueName: \"kubernetes.io/projected/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-kube-api-access-2rxzf\") pod \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.270614 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-crc-storage\") pod \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.270696 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-node-mnt\") pod \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\" (UID: \"edadf0b9-7f51-445f-8dd1-53dc9fae53aa\") " Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.270915 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "edadf0b9-7f51-445f-8dd1-53dc9fae53aa" (UID: "edadf0b9-7f51-445f-8dd1-53dc9fae53aa"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.271306 4931 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.279407 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-kube-api-access-2rxzf" (OuterVolumeSpecName: "kube-api-access-2rxzf") pod "edadf0b9-7f51-445f-8dd1-53dc9fae53aa" (UID: "edadf0b9-7f51-445f-8dd1-53dc9fae53aa"). InnerVolumeSpecName "kube-api-access-2rxzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.304508 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "edadf0b9-7f51-445f-8dd1-53dc9fae53aa" (UID: "edadf0b9-7f51-445f-8dd1-53dc9fae53aa"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.372298 4931 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.372337 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rxzf\" (UniqueName: \"kubernetes.io/projected/edadf0b9-7f51-445f-8dd1-53dc9fae53aa-kube-api-access-2rxzf\") on node \"crc\" DevicePath \"\"" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.797291 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h9twk" event={"ID":"edadf0b9-7f51-445f-8dd1-53dc9fae53aa","Type":"ContainerDied","Data":"720bcba1a2545c662bf1fa3d80619562b3bc89fa22675dd3d667df179fa2299b"} Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.797695 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="720bcba1a2545c662bf1fa3d80619562b3bc89fa22675dd3d667df179fa2299b" Jan 30 06:20:08 crc kubenswrapper[4931]: I0130 06:20:08.797570 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h9twk" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.555839 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-h9twk"] Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.566657 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-h9twk"] Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.725155 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-b8f6v"] Jan 30 06:20:10 crc kubenswrapper[4931]: E0130 06:20:10.725624 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edadf0b9-7f51-445f-8dd1-53dc9fae53aa" containerName="storage" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.725655 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="edadf0b9-7f51-445f-8dd1-53dc9fae53aa" containerName="storage" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.725893 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="edadf0b9-7f51-445f-8dd1-53dc9fae53aa" containerName="storage" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.726695 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.729325 4931 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-ff66z" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.730410 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.732132 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.739198 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.745710 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b8f6v"] Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.812586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-crc-storage\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.812767 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw24d\" (UniqueName: \"kubernetes.io/projected/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-kube-api-access-nw24d\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.813116 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-node-mnt\") pod \"crc-storage-crc-b8f6v\" (UID: 
\"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.914359 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-node-mnt\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.914491 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-crc-storage\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.914579 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw24d\" (UniqueName: \"kubernetes.io/projected/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-kube-api-access-nw24d\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.914781 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-node-mnt\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.915756 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-crc-storage\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:10 crc kubenswrapper[4931]: I0130 06:20:10.947961 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw24d\" (UniqueName: \"kubernetes.io/projected/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-kube-api-access-nw24d\") pod \"crc-storage-crc-b8f6v\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:11 crc kubenswrapper[4931]: I0130 06:20:11.057544 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:11 crc kubenswrapper[4931]: I0130 06:20:11.346665 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b8f6v"] Jan 30 06:20:11 crc kubenswrapper[4931]: W0130 06:20:11.358831 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d2f9de_7c7c_4cc3_9bce_e244b9d73535.slice/crio-1b8dfa464b7df25bb3e4d6dc1b8c41f38c19d57162b9bd74a5c4218406cbda56 WatchSource:0}: Error finding container 1b8dfa464b7df25bb3e4d6dc1b8c41f38c19d57162b9bd74a5c4218406cbda56: Status 404 returned error can't find the container with id 1b8dfa464b7df25bb3e4d6dc1b8c41f38c19d57162b9bd74a5c4218406cbda56 Jan 30 06:20:11 crc kubenswrapper[4931]: I0130 06:20:11.433860 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edadf0b9-7f51-445f-8dd1-53dc9fae53aa" path="/var/lib/kubelet/pods/edadf0b9-7f51-445f-8dd1-53dc9fae53aa/volumes" Jan 30 06:20:11 crc kubenswrapper[4931]: I0130 06:20:11.839548 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b8f6v" event={"ID":"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535","Type":"ContainerStarted","Data":"1b8dfa464b7df25bb3e4d6dc1b8c41f38c19d57162b9bd74a5c4218406cbda56"} Jan 30 06:20:12 crc kubenswrapper[4931]: I0130 06:20:12.851858 4931 generic.go:334] "Generic (PLEG): container finished" podID="e5d2f9de-7c7c-4cc3-9bce-e244b9d73535" containerID="c6a166ae7c4660bbbdc44dc7c1b8670619ea5b26cb3c5e0fee1c8bbd8f4bb2af" 
exitCode=0 Jan 30 06:20:12 crc kubenswrapper[4931]: I0130 06:20:12.852014 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b8f6v" event={"ID":"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535","Type":"ContainerDied","Data":"c6a166ae7c4660bbbdc44dc7c1b8670619ea5b26cb3c5e0fee1c8bbd8f4bb2af"} Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.225575 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.370888 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-crc-storage\") pod \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.370997 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-node-mnt\") pod \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.371102 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw24d\" (UniqueName: \"kubernetes.io/projected/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-kube-api-access-nw24d\") pod \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\" (UID: \"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535\") " Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.371564 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e5d2f9de-7c7c-4cc3-9bce-e244b9d73535" (UID: "e5d2f9de-7c7c-4cc3-9bce-e244b9d73535"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.371805 4931 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.379418 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-kube-api-access-nw24d" (OuterVolumeSpecName: "kube-api-access-nw24d") pod "e5d2f9de-7c7c-4cc3-9bce-e244b9d73535" (UID: "e5d2f9de-7c7c-4cc3-9bce-e244b9d73535"). InnerVolumeSpecName "kube-api-access-nw24d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.401757 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e5d2f9de-7c7c-4cc3-9bce-e244b9d73535" (UID: "e5d2f9de-7c7c-4cc3-9bce-e244b9d73535"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.473371 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw24d\" (UniqueName: \"kubernetes.io/projected/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-kube-api-access-nw24d\") on node \"crc\" DevicePath \"\"" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.473407 4931 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e5d2f9de-7c7c-4cc3-9bce-e244b9d73535-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.872782 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b8f6v" event={"ID":"e5d2f9de-7c7c-4cc3-9bce-e244b9d73535","Type":"ContainerDied","Data":"1b8dfa464b7df25bb3e4d6dc1b8c41f38c19d57162b9bd74a5c4218406cbda56"} Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.872842 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b8dfa464b7df25bb3e4d6dc1b8c41f38c19d57162b9bd74a5c4218406cbda56" Jan 30 06:20:14 crc kubenswrapper[4931]: I0130 06:20:14.872858 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-b8f6v" Jan 30 06:20:16 crc kubenswrapper[4931]: I0130 06:20:16.422621 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:20:16 crc kubenswrapper[4931]: E0130 06:20:16.423554 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:20:31 crc kubenswrapper[4931]: I0130 06:20:31.421861 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:20:31 crc kubenswrapper[4931]: E0130 06:20:31.423160 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:20:46 crc kubenswrapper[4931]: I0130 06:20:46.422273 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:20:46 crc kubenswrapper[4931]: E0130 06:20:46.423807 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:20:53 crc kubenswrapper[4931]: I0130 06:20:53.351472 4931 scope.go:117] "RemoveContainer" containerID="ceeb8bcdff334f1b3490e1ee30443dff7dd6fd17a3f2d90428a1f38ad6f3cd5e" Jan 30 06:21:00 crc kubenswrapper[4931]: I0130 06:21:00.424058 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:21:00 crc kubenswrapper[4931]: E0130 06:21:00.426250 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:21:12 crc kubenswrapper[4931]: I0130 06:21:12.422614 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:21:12 crc kubenswrapper[4931]: E0130 06:21:12.423841 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:21:24 crc kubenswrapper[4931]: I0130 06:21:24.422476 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:21:24 crc kubenswrapper[4931]: E0130 06:21:24.423852 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:21:36 crc kubenswrapper[4931]: I0130 06:21:36.421735 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:21:36 crc kubenswrapper[4931]: E0130 06:21:36.422773 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:21:48 crc kubenswrapper[4931]: I0130 06:21:48.421860 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:21:48 crc kubenswrapper[4931]: E0130 06:21:48.422763 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:22:02 crc kubenswrapper[4931]: I0130 06:22:02.422829 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:22:02 crc kubenswrapper[4931]: E0130 06:22:02.423878 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:22:13 crc kubenswrapper[4931]: I0130 06:22:13.422192 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:22:13 crc kubenswrapper[4931]: E0130 06:22:13.423227 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:22:26 crc kubenswrapper[4931]: I0130 06:22:26.422950 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:22:26 crc kubenswrapper[4931]: E0130 06:22:26.424385 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:22:37 crc kubenswrapper[4931]: I0130 06:22:37.426028 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:22:37 crc kubenswrapper[4931]: E0130 06:22:37.427015 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:22:48 crc kubenswrapper[4931]: I0130 06:22:48.423013 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:22:48 crc kubenswrapper[4931]: E0130 06:22:48.424276 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:23:01 crc kubenswrapper[4931]: I0130 06:23:01.422098 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:23:01 crc kubenswrapper[4931]: E0130 06:23:01.423143 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:23:13 crc kubenswrapper[4931]: I0130 06:23:13.422145 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:23:13 crc kubenswrapper[4931]: E0130 06:23:13.423266 4931 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.324939 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95587bc99-x7g4t"] Jan 30 06:23:22 crc kubenswrapper[4931]: E0130 06:23:22.325598 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d2f9de-7c7c-4cc3-9bce-e244b9d73535" containerName="storage" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.325612 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d2f9de-7c7c-4cc3-9bce-e244b9d73535" containerName="storage" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.325732 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d2f9de-7c7c-4cc3-9bce-e244b9d73535" containerName="storage" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.326606 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.329257 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.329479 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.329711 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.331448 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ccbcs" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.335377 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-x7g4t"] Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.336354 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.425403 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lncd9\" (UniqueName: \"kubernetes.io/projected/6aa8cfa6-8d93-4f4c-844e-f180daf03802-kube-api-access-lncd9\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.425463 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-config\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.426062 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-dns-svc\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.527436 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-config\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.527771 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-dns-svc\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.527802 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lncd9\" (UniqueName: \"kubernetes.io/projected/6aa8cfa6-8d93-4f4c-844e-f180daf03802-kube-api-access-lncd9\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.528578 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-dns-svc\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.532213 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-config\") pod 
\"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.558275 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-7dq6z"] Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.559448 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.563512 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lncd9\" (UniqueName: \"kubernetes.io/projected/6aa8cfa6-8d93-4f4c-844e-f180daf03802-kube-api-access-lncd9\") pod \"dnsmasq-dns-95587bc99-x7g4t\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") " pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.613482 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-7dq6z"] Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.646946 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.734039 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l2x7\" (UniqueName: \"kubernetes.io/projected/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-kube-api-access-5l2x7\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.734127 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-config\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.734154 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.836571 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l2x7\" (UniqueName: \"kubernetes.io/projected/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-kube-api-access-5l2x7\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.836891 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-config\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: 
\"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.836935 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.837905 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.837998 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-config\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.903677 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l2x7\" (UniqueName: \"kubernetes.io/projected/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-kube-api-access-5l2x7\") pod \"dnsmasq-dns-5d79f765b5-7dq6z\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:22 crc kubenswrapper[4931]: I0130 06:23:22.916458 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.120062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-x7g4t"] Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.327058 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-7dq6z"] Jan 30 06:23:23 crc kubenswrapper[4931]: W0130 06:23:23.358108 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda43ae7a0_f3ca_4eb9_ae23_e6dc8f4e5a1c.slice/crio-df23f9bc8284d347a4dda805a55b9a64a2739f707161ba449d8a6399ee4d3665 WatchSource:0}: Error finding container df23f9bc8284d347a4dda805a55b9a64a2739f707161ba449d8a6399ee4d3665: Status 404 returned error can't find the container with id df23f9bc8284d347a4dda805a55b9a64a2739f707161ba449d8a6399ee4d3665 Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.452365 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.453783 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.457948 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.457971 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.458189 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.458273 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mgh9s" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.458477 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.469314 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650415 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650504 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650550 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ffda181-212b-42f4-bd56-9ab2864ded3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650593 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650614 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ffda181-212b-42f4-bd56-9ab2864ded3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650713 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650928 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ngdz\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-kube-api-access-2ngdz\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.650967 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.651059 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.705982 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.707110 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.710636 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.710750 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.711348 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wgz6z" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.712145 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.712405 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.729310 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 
06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752059 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ngdz\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-kube-api-access-2ngdz\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752108 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752152 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752188 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752212 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752237 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ffda181-212b-42f4-bd56-9ab2864ded3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752255 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752270 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ffda181-212b-42f4-bd56-9ab2864ded3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752295 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.752758 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.754157 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.755616 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.756035 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.756927 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.762876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ffda181-212b-42f4-bd56-9ab2864ded3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.763344 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ffda181-212b-42f4-bd56-9ab2864ded3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc 
kubenswrapper[4931]: I0130 06:23:23.773302 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.773353 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/96f6795482cf208a0436fcedd4e13f5ef58c9a3e2d9d6166beea188ab34f9e81/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.779568 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ngdz\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-kube-api-access-2ngdz\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.808802 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853185 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853237 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853290 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853495 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3e54229-729f-4bfc-a208-dc39edc35b8a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c3e54229-729f-4bfc-a208-dc39edc35b8a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853631 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bknnm\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-kube-api-access-bknnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc 
kubenswrapper[4931]: I0130 06:23:23.853653 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853681 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.853745 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.930931 4931 generic.go:334] "Generic (PLEG): container finished" podID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerID="85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7" exitCode=0 Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.930995 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" event={"ID":"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c","Type":"ContainerDied","Data":"85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7"} Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.931338 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" 
event={"ID":"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c","Type":"ContainerStarted","Data":"df23f9bc8284d347a4dda805a55b9a64a2739f707161ba449d8a6399ee4d3665"} Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.932907 4931 generic.go:334] "Generic (PLEG): container finished" podID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerID="05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5" exitCode=0 Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.932941 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" event={"ID":"6aa8cfa6-8d93-4f4c-844e-f180daf03802","Type":"ContainerDied","Data":"05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5"} Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.932986 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" event={"ID":"6aa8cfa6-8d93-4f4c-844e-f180daf03802","Type":"ContainerStarted","Data":"1d3b909cce0f1614532a2470422ac028e30b8c0cc10d3660b10b49f6654468a0"} Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.955545 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.955722 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.955834 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.955931 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3e54229-729f-4bfc-a208-dc39edc35b8a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.956011 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c3e54229-729f-4bfc-a208-dc39edc35b8a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.956119 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bknnm\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-kube-api-access-bknnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.956227 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.956304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.956404 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.958072 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.960400 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.961768 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.963163 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.965874 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.967488 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.967532 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/081bf8f84c60dfb918ea0eb5418be09e59105cf3295dde894d1b133731bc6391/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.967843 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c3e54229-729f-4bfc-a208-dc39edc35b8a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.980580 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bknnm\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-kube-api-access-bknnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:23 crc kubenswrapper[4931]: I0130 06:23:23.984301 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3e54229-729f-4bfc-a208-dc39edc35b8a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.017331 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.030198 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.104409 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.422666 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:23:24 crc kubenswrapper[4931]: E0130 06:23:24.423614 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.502627 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:23:24 crc kubenswrapper[4931]: W0130 06:23:24.506355 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e54229_729f_4bfc_a208_dc39edc35b8a.slice/crio-8795a34898e6841b11366ed2b6c1d443c46d48e62ae64548732f8a0d5ccd026d WatchSource:0}: Error finding container 8795a34898e6841b11366ed2b6c1d443c46d48e62ae64548732f8a0d5ccd026d: Status 404 returned error can't find the container with id 8795a34898e6841b11366ed2b6c1d443c46d48e62ae64548732f8a0d5ccd026d Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.570690 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:23:24 crc kubenswrapper[4931]: W0130 06:23:24.588273 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ffda181_212b_42f4_bd56_9ab2864ded3c.slice/crio-d1546caee0e50a6ac2e387f8e39e4451c82f903d75d2682ba4430ed473fac38f WatchSource:0}: Error finding container d1546caee0e50a6ac2e387f8e39e4451c82f903d75d2682ba4430ed473fac38f: 
Status 404 returned error can't find the container with id d1546caee0e50a6ac2e387f8e39e4451c82f903d75d2682ba4430ed473fac38f Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.774324 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.776469 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.778871 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x2rnl" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.778884 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.779106 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.780290 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.788608 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.790872 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.881966 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.882008 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.882042 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-kolla-config\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.882078 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.882104 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-config-data-default\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.882298 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.882399 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-mh2cw\" (UniqueName: \"kubernetes.io/projected/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-kube-api-access-mh2cw\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.882484 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.961173 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" event={"ID":"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c","Type":"ContainerStarted","Data":"8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46"} Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.963495 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.966772 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" event={"ID":"6aa8cfa6-8d93-4f4c-844e-f180daf03802","Type":"ContainerStarted","Data":"550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316"} Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.967728 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.968680 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2ffda181-212b-42f4-bd56-9ab2864ded3c","Type":"ContainerStarted","Data":"d1546caee0e50a6ac2e387f8e39e4451c82f903d75d2682ba4430ed473fac38f"} Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.969890 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3e54229-729f-4bfc-a208-dc39edc35b8a","Type":"ContainerStarted","Data":"8795a34898e6841b11366ed2b6c1d443c46d48e62ae64548732f8a0d5ccd026d"} Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.984818 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.984915 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.984942 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.984976 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-kolla-config\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.985017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.985048 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-config-data-default\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.985100 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.985133 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2cw\" (UniqueName: \"kubernetes.io/projected/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-kube-api-access-mh2cw\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.986692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-kolla-config\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.986914 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.988866 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-config-data-default\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:24 crc kubenswrapper[4931]: I0130 06:23:24.989037 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.000758 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.000828 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fcebf3741f68bcc13c2190545ffc3184ab1144655693389ba5c7ca5ba7137f7b/globalmount\"" pod="openstack/openstack-galera-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.009844 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" podStartSLOduration=3.009815087 podStartE2EDuration="3.009815087s" podCreationTimestamp="2026-01-30 06:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:23:24.998919272 +0000 UTC m=+4540.368829539" watchObservedRunningTime="2026-01-30 06:23:25.009815087 +0000 UTC m=+4540.379725384" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.013538 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.019471 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.022818 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mh2cw\" (UniqueName: \"kubernetes.io/projected/01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3-kube-api-access-mh2cw\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.031559 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" podStartSLOduration=3.031529184 podStartE2EDuration="3.031529184s" podCreationTimestamp="2026-01-30 06:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:23:25.025795373 +0000 UTC m=+4540.395705670" watchObservedRunningTime="2026-01-30 06:23:25.031529184 +0000 UTC m=+4540.401439461" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.061169 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-369a6bb3-0312-4fe7-ba1c-d7defd15de60\") pod \"openstack-galera-0\" (UID: \"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3\") " pod="openstack/openstack-galera-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.099448 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.390491 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.391856 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.394354 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.394410 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4wtw7" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.404976 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.500756 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5c3365d-6967-42e2-b00c-887a82a1b73e-config-data\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.500829 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5c3365d-6967-42e2-b00c-887a82a1b73e-kolla-config\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.500857 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7hln\" (UniqueName: \"kubernetes.io/projected/f5c3365d-6967-42e2-b00c-887a82a1b73e-kube-api-access-b7hln\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.527065 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.602991 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/f5c3365d-6967-42e2-b00c-887a82a1b73e-kolla-config\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.603086 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7hln\" (UniqueName: \"kubernetes.io/projected/f5c3365d-6967-42e2-b00c-887a82a1b73e-kube-api-access-b7hln\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.603252 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5c3365d-6967-42e2-b00c-887a82a1b73e-config-data\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.603860 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5c3365d-6967-42e2-b00c-887a82a1b73e-kolla-config\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.604362 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5c3365d-6967-42e2-b00c-887a82a1b73e-config-data\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.706522 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7hln\" (UniqueName: \"kubernetes.io/projected/f5c3365d-6967-42e2-b00c-887a82a1b73e-kube-api-access-b7hln\") pod \"memcached-0\" (UID: \"f5c3365d-6967-42e2-b00c-887a82a1b73e\") " pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 
06:23:25.710581 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.982277 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3e54229-729f-4bfc-a208-dc39edc35b8a","Type":"ContainerStarted","Data":"ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f"} Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.984356 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2ffda181-212b-42f4-bd56-9ab2864ded3c","Type":"ContainerStarted","Data":"1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee"} Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.987206 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3","Type":"ContainerStarted","Data":"347f4a99c878c60ced04fec19dba23244dbe8505c061092184cd7fe73d781eff"} Jan 30 06:23:25 crc kubenswrapper[4931]: I0130 06:23:25.987252 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3","Type":"ContainerStarted","Data":"bfffff77e88f4155244cfa8d546485a9a12b5f3703b432b1c04b887ff976bbd6"} Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.195525 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 06:23:26 crc kubenswrapper[4931]: W0130 06:23:26.198786 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5c3365d_6967_42e2_b00c_887a82a1b73e.slice/crio-7d3f316e2e1f08c4867ec8dfa647ae413af47ae1b01c28d4c9d377e88fe5ee34 WatchSource:0}: Error finding container 7d3f316e2e1f08c4867ec8dfa647ae413af47ae1b01c28d4c9d377e88fe5ee34: Status 404 returned error can't find the container with id 
7d3f316e2e1f08c4867ec8dfa647ae413af47ae1b01c28d4c9d377e88fe5ee34 Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.451056 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.452436 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.454783 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2tv2r" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.457781 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.457926 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.458959 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.487189 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.620534 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66165b19-dfc8-403f-ae09-30299db6b19f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.621872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-config-data-default\") pod \"openstack-cell1-galera-0\" 
(UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.622057 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66165b19-dfc8-403f-ae09-30299db6b19f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.622241 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.622354 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66165b19-dfc8-403f-ae09-30299db6b19f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.622551 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtp4h\" (UniqueName: \"kubernetes.io/projected/66165b19-dfc8-403f-ae09-30299db6b19f-kube-api-access-dtp4h\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.622710 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.622831 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724212 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtp4h\" (UniqueName: \"kubernetes.io/projected/66165b19-dfc8-403f-ae09-30299db6b19f-kube-api-access-dtp4h\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724317 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724377 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724480 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/66165b19-dfc8-403f-ae09-30299db6b19f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724552 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724597 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66165b19-dfc8-403f-ae09-30299db6b19f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724648 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.724684 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66165b19-dfc8-403f-ae09-30299db6b19f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.725386 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/66165b19-dfc8-403f-ae09-30299db6b19f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.726628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.726676 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.728689 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66165b19-dfc8-403f-ae09-30299db6b19f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.732564 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66165b19-dfc8-403f-ae09-30299db6b19f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.734082 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.734138 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ee78c3c6a638a66d2e564da262d5ca34b72507ff7765ff278d9157cb38212396/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.735302 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66165b19-dfc8-403f-ae09-30299db6b19f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.743090 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtp4h\" (UniqueName: \"kubernetes.io/projected/66165b19-dfc8-403f-ae09-30299db6b19f-kube-api-access-dtp4h\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.761134 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-52ee0189-780a-4812-b1e3-fbd3952c4c00\") pod \"openstack-cell1-galera-0\" (UID: \"66165b19-dfc8-403f-ae09-30299db6b19f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.811681 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.996575 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5c3365d-6967-42e2-b00c-887a82a1b73e","Type":"ContainerStarted","Data":"50f2dbb718a935ab17cb4ed5b180bca0eb84c7ca8339c04e4870c7611e9d5001"}
Jan 30 06:23:26 crc kubenswrapper[4931]: I0130 06:23:26.997029 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5c3365d-6967-42e2-b00c-887a82a1b73e","Type":"ContainerStarted","Data":"7d3f316e2e1f08c4867ec8dfa647ae413af47ae1b01c28d4c9d377e88fe5ee34"}
Jan 30 06:23:27 crc kubenswrapper[4931]: I0130 06:23:27.020102 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.020079351 podStartE2EDuration="2.020079351s" podCreationTimestamp="2026-01-30 06:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:23:27.01397134 +0000 UTC m=+4542.383881607" watchObservedRunningTime="2026-01-30 06:23:27.020079351 +0000 UTC m=+4542.389989608"
Jan 30 06:23:27 crc kubenswrapper[4931]: W0130 06:23:27.282861 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66165b19_dfc8_403f_ae09_30299db6b19f.slice/crio-a8e6ee76c5e248997766131dd53dfc389bc3138a1466ea69cca0e56dfead52e1 WatchSource:0}: Error finding container a8e6ee76c5e248997766131dd53dfc389bc3138a1466ea69cca0e56dfead52e1: Status 404 returned error can't find the container with id a8e6ee76c5e248997766131dd53dfc389bc3138a1466ea69cca0e56dfead52e1
Jan 30 06:23:27 crc kubenswrapper[4931]: I0130 06:23:27.283967 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 30 06:23:28 crc kubenswrapper[4931]: I0130 06:23:28.009114 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"66165b19-dfc8-403f-ae09-30299db6b19f","Type":"ContainerStarted","Data":"8589e17dde43a588a9507ad40fe1ffee93a647ef1321ae9eab7554654bab0d99"}
Jan 30 06:23:28 crc kubenswrapper[4931]: I0130 06:23:28.009724 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Jan 30 06:23:28 crc kubenswrapper[4931]: I0130 06:23:28.009760 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"66165b19-dfc8-403f-ae09-30299db6b19f","Type":"ContainerStarted","Data":"a8e6ee76c5e248997766131dd53dfc389bc3138a1466ea69cca0e56dfead52e1"}
Jan 30 06:23:30 crc kubenswrapper[4931]: I0130 06:23:30.027058 4931 generic.go:334] "Generic (PLEG): container finished" podID="01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3" containerID="347f4a99c878c60ced04fec19dba23244dbe8505c061092184cd7fe73d781eff" exitCode=0
Jan 30 06:23:30 crc kubenswrapper[4931]: I0130 06:23:30.027135 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3","Type":"ContainerDied","Data":"347f4a99c878c60ced04fec19dba23244dbe8505c061092184cd7fe73d781eff"}
Jan 30 06:23:31 crc kubenswrapper[4931]: I0130 06:23:31.039580 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3","Type":"ContainerStarted","Data":"7b543a983037944ab2115585669090d4c1c4c5d04dfcf7437337b1e278dd9fcf"}
Jan 30 06:23:31 crc kubenswrapper[4931]: I0130 06:23:31.079610 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.079584641 podStartE2EDuration="8.079584641s" podCreationTimestamp="2026-01-30 06:23:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:23:31.068651425 +0000 UTC m=+4546.438561712" watchObservedRunningTime="2026-01-30 06:23:31.079584641 +0000 UTC m=+4546.449494938"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.052515 4931 generic.go:334] "Generic (PLEG): container finished" podID="66165b19-dfc8-403f-ae09-30299db6b19f" containerID="8589e17dde43a588a9507ad40fe1ffee93a647ef1321ae9eab7554654bab0d99" exitCode=0
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.052616 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"66165b19-dfc8-403f-ae09-30299db6b19f","Type":"ContainerDied","Data":"8589e17dde43a588a9507ad40fe1ffee93a647ef1321ae9eab7554654bab0d99"}
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.413709 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-44rwk"]
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.416134 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.423056 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44rwk"]
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.546598 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjt8j\" (UniqueName: \"kubernetes.io/projected/a76cda73-9eb5-4a03-aa82-713af868b080-kube-api-access-pjt8j\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.546724 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-catalog-content\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.546769 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-utilities\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.648389 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjt8j\" (UniqueName: \"kubernetes.io/projected/a76cda73-9eb5-4a03-aa82-713af868b080-kube-api-access-pjt8j\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.648480 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-catalog-content\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.648508 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-utilities\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.648676 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95587bc99-x7g4t"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.648924 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-utilities\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.649081 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-catalog-content\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.684244 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjt8j\" (UniqueName: \"kubernetes.io/projected/a76cda73-9eb5-4a03-aa82-713af868b080-kube-api-access-pjt8j\") pod \"redhat-marketplace-44rwk\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") " pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.733630 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.917543 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z"
Jan 30 06:23:32 crc kubenswrapper[4931]: I0130 06:23:32.997839 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-x7g4t"]
Jan 30 06:23:33 crc kubenswrapper[4931]: I0130 06:23:33.061179 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" podUID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerName="dnsmasq-dns" containerID="cri-o://550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316" gracePeriod=10
Jan 30 06:23:33 crc kubenswrapper[4931]: I0130 06:23:33.061489 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"66165b19-dfc8-403f-ae09-30299db6b19f","Type":"ContainerStarted","Data":"c8da065dc67a7d340a0bfa87355c64913e5f8e8af5dcfd198ef08fb14f210ff4"}
Jan 30 06:23:33 crc kubenswrapper[4931]: I0130 06:23:33.086128 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.086111602 podStartE2EDuration="8.086111602s" podCreationTimestamp="2026-01-30 06:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:23:33.081995786 +0000 UTC m=+4548.451906043" watchObservedRunningTime="2026-01-30 06:23:33.086111602 +0000 UTC m=+4548.456021859"
Jan 30 06:23:33 crc kubenswrapper[4931]: I0130 06:23:33.205185 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44rwk"]
Jan 30 06:23:33 crc kubenswrapper[4931]: W0130 06:23:33.710860 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda76cda73_9eb5_4a03_aa82_713af868b080.slice/crio-13f9e353d857817a966433846a45a41341967cdad2202309b77ec7da31ff2c31 WatchSource:0}: Error finding container 13f9e353d857817a966433846a45a41341967cdad2202309b77ec7da31ff2c31: Status 404 returned error can't find the container with id 13f9e353d857817a966433846a45a41341967cdad2202309b77ec7da31ff2c31
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.003099 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-x7g4t"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.071243 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lncd9\" (UniqueName: \"kubernetes.io/projected/6aa8cfa6-8d93-4f4c-844e-f180daf03802-kube-api-access-lncd9\") pod \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") "
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.071376 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-config\") pod \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") "
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.071529 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-dns-svc\") pod \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\" (UID: \"6aa8cfa6-8d93-4f4c-844e-f180daf03802\") "
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.079108 4931 generic.go:334] "Generic (PLEG): container finished" podID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerID="550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316" exitCode=0
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.079474 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-x7g4t"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.080078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" event={"ID":"6aa8cfa6-8d93-4f4c-844e-f180daf03802","Type":"ContainerDied","Data":"550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316"}
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.080113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-x7g4t" event={"ID":"6aa8cfa6-8d93-4f4c-844e-f180daf03802","Type":"ContainerDied","Data":"1d3b909cce0f1614532a2470422ac028e30b8c0cc10d3660b10b49f6654468a0"}
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.080134 4931 scope.go:117] "RemoveContainer" containerID="550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.081475 4931 generic.go:334] "Generic (PLEG): container finished" podID="a76cda73-9eb5-4a03-aa82-713af868b080" containerID="afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd" exitCode=0
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.081497 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44rwk" event={"ID":"a76cda73-9eb5-4a03-aa82-713af868b080","Type":"ContainerDied","Data":"afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd"}
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.081515 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44rwk" event={"ID":"a76cda73-9eb5-4a03-aa82-713af868b080","Type":"ContainerStarted","Data":"13f9e353d857817a966433846a45a41341967cdad2202309b77ec7da31ff2c31"}
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.103694 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa8cfa6-8d93-4f4c-844e-f180daf03802-kube-api-access-lncd9" (OuterVolumeSpecName: "kube-api-access-lncd9") pod "6aa8cfa6-8d93-4f4c-844e-f180daf03802" (UID: "6aa8cfa6-8d93-4f4c-844e-f180daf03802"). InnerVolumeSpecName "kube-api-access-lncd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.131078 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6aa8cfa6-8d93-4f4c-844e-f180daf03802" (UID: "6aa8cfa6-8d93-4f4c-844e-f180daf03802"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.137139 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-config" (OuterVolumeSpecName: "config") pod "6aa8cfa6-8d93-4f4c-844e-f180daf03802" (UID: "6aa8cfa6-8d93-4f4c-844e-f180daf03802"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.174449 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.174501 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lncd9\" (UniqueName: \"kubernetes.io/projected/6aa8cfa6-8d93-4f4c-844e-f180daf03802-kube-api-access-lncd9\") on node \"crc\" DevicePath \"\""
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.174522 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa8cfa6-8d93-4f4c-844e-f180daf03802-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.189521 4931 scope.go:117] "RemoveContainer" containerID="05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.208016 4931 scope.go:117] "RemoveContainer" containerID="550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316"
Jan 30 06:23:34 crc kubenswrapper[4931]: E0130 06:23:34.208495 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316\": container with ID starting with 550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316 not found: ID does not exist" containerID="550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.208525 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316"} err="failed to get container status \"550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316\": rpc error: code = NotFound desc = could not find container \"550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316\": container with ID starting with 550a3fe365e2455dfd0d4c3bd533972ce18601eb9855307b54e4d35d258dd316 not found: ID does not exist"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.208543 4931 scope.go:117] "RemoveContainer" containerID="05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5"
Jan 30 06:23:34 crc kubenswrapper[4931]: E0130 06:23:34.209073 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5\": container with ID starting with 05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5 not found: ID does not exist" containerID="05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.209095 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5"} err="failed to get container status \"05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5\": rpc error: code = NotFound desc = could not find container \"05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5\": container with ID starting with 05364d77fba2566dc0fb047fc3895d7c2ab607cd60db2040c9304dc2551d05d5 not found: ID does not exist"
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.442045 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-x7g4t"]
Jan 30 06:23:34 crc kubenswrapper[4931]: I0130 06:23:34.454213 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-x7g4t"]
Jan 30 06:23:35 crc kubenswrapper[4931]: I0130 06:23:35.099965 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 30 06:23:35 crc kubenswrapper[4931]: I0130 06:23:35.100724 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 30 06:23:35 crc kubenswrapper[4931]: I0130 06:23:35.113114 4931 generic.go:334] "Generic (PLEG): container finished" podID="a76cda73-9eb5-4a03-aa82-713af868b080" containerID="f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587" exitCode=0
Jan 30 06:23:35 crc kubenswrapper[4931]: I0130 06:23:35.113163 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44rwk" event={"ID":"a76cda73-9eb5-4a03-aa82-713af868b080","Type":"ContainerDied","Data":"f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587"}
Jan 30 06:23:35 crc kubenswrapper[4931]: I0130 06:23:35.433326 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" path="/var/lib/kubelet/pods/6aa8cfa6-8d93-4f4c-844e-f180daf03802/volumes"
Jan 30 06:23:35 crc kubenswrapper[4931]: I0130 06:23:35.712236 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 30 06:23:36 crc kubenswrapper[4931]: I0130 06:23:36.127070 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44rwk" event={"ID":"a76cda73-9eb5-4a03-aa82-713af868b080","Type":"ContainerStarted","Data":"37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35"}
Jan 30 06:23:36 crc kubenswrapper[4931]: I0130 06:23:36.152829 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-44rwk" podStartSLOduration=2.585329306 podStartE2EDuration="4.152801139s" podCreationTimestamp="2026-01-30 06:23:32 +0000 UTC" firstStartedPulling="2026-01-30 06:23:34.083238266 +0000 UTC m=+4549.453148533" lastFinishedPulling="2026-01-30 06:23:35.650710069 +0000 UTC m=+4551.020620366" observedRunningTime="2026-01-30 06:23:36.150272628 +0000 UTC m=+4551.520182885" watchObservedRunningTime="2026-01-30 06:23:36.152801139 +0000 UTC m=+4551.522711396"
Jan 30 06:23:36 crc kubenswrapper[4931]: I0130 06:23:36.813085 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Jan 30 06:23:36 crc kubenswrapper[4931]: I0130 06:23:36.813141 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Jan 30 06:23:37 crc kubenswrapper[4931]: I0130 06:23:37.933861 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 30 06:23:38 crc kubenswrapper[4931]: I0130 06:23:38.053529 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 30 06:23:38 crc kubenswrapper[4931]: I0130 06:23:38.423110 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229"
Jan 30 06:23:38 crc kubenswrapper[4931]: E0130 06:23:38.423617 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:23:39 crc kubenswrapper[4931]: I0130 06:23:39.387193 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 30 06:23:39 crc kubenswrapper[4931]: I0130 06:23:39.517933 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 30 06:23:42 crc kubenswrapper[4931]: I0130 06:23:42.734651 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:42 crc kubenswrapper[4931]: I0130 06:23:42.735021 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:42 crc kubenswrapper[4931]: I0130 06:23:42.809894 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.272354 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.324190 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44rwk"]
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.751527 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hfrzc"]
Jan 30 06:23:43 crc kubenswrapper[4931]: E0130 06:23:43.751954 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerName="init"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.751977 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerName="init"
Jan 30 06:23:43 crc kubenswrapper[4931]: E0130 06:23:43.752007 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerName="dnsmasq-dns"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.752021 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerName="dnsmasq-dns"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.752324 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa8cfa6-8d93-4f4c-844e-f180daf03802" containerName="dnsmasq-dns"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.753282 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.759822 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.772353 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hfrzc"]
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.842847 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6zn5\" (UniqueName: \"kubernetes.io/projected/fe686ff0-2117-48c0-bde6-41faa75e59b7-kube-api-access-l6zn5\") pod \"root-account-create-update-hfrzc\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.842925 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe686ff0-2117-48c0-bde6-41faa75e59b7-operator-scripts\") pod \"root-account-create-update-hfrzc\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.944877 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6zn5\" (UniqueName: \"kubernetes.io/projected/fe686ff0-2117-48c0-bde6-41faa75e59b7-kube-api-access-l6zn5\") pod \"root-account-create-update-hfrzc\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.945010 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe686ff0-2117-48c0-bde6-41faa75e59b7-operator-scripts\") pod \"root-account-create-update-hfrzc\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.946398 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe686ff0-2117-48c0-bde6-41faa75e59b7-operator-scripts\") pod \"root-account-create-update-hfrzc\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:43 crc kubenswrapper[4931]: I0130 06:23:43.979521 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6zn5\" (UniqueName: \"kubernetes.io/projected/fe686ff0-2117-48c0-bde6-41faa75e59b7-kube-api-access-l6zn5\") pod \"root-account-create-update-hfrzc\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:44 crc kubenswrapper[4931]: I0130 06:23:44.082777 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hfrzc"
Jan 30 06:23:44 crc kubenswrapper[4931]: I0130 06:23:44.668666 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hfrzc"]
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.231140 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe686ff0-2117-48c0-bde6-41faa75e59b7" containerID="71a8124b599814d410f6d79c3260191d88bc5b88a1405b9bfb832aebcb013dc4" exitCode=0
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.231220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hfrzc" event={"ID":"fe686ff0-2117-48c0-bde6-41faa75e59b7","Type":"ContainerDied","Data":"71a8124b599814d410f6d79c3260191d88bc5b88a1405b9bfb832aebcb013dc4"}
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.231711 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hfrzc" event={"ID":"fe686ff0-2117-48c0-bde6-41faa75e59b7","Type":"ContainerStarted","Data":"2035b861086e20c1fb342c823c4eb82d86fc588a7b4c49c61d68ad62b4ba70ac"}
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.231930 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-44rwk" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="registry-server" containerID="cri-o://37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35" gracePeriod=2
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.864546 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.986646 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-catalog-content\") pod \"a76cda73-9eb5-4a03-aa82-713af868b080\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") "
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.986725 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjt8j\" (UniqueName: \"kubernetes.io/projected/a76cda73-9eb5-4a03-aa82-713af868b080-kube-api-access-pjt8j\") pod \"a76cda73-9eb5-4a03-aa82-713af868b080\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") "
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.986768 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-utilities\") pod \"a76cda73-9eb5-4a03-aa82-713af868b080\" (UID: \"a76cda73-9eb5-4a03-aa82-713af868b080\") "
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.987764 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-utilities" (OuterVolumeSpecName: "utilities") pod "a76cda73-9eb5-4a03-aa82-713af868b080" (UID: "a76cda73-9eb5-4a03-aa82-713af868b080"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:23:45 crc kubenswrapper[4931]: I0130 06:23:45.999785 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76cda73-9eb5-4a03-aa82-713af868b080-kube-api-access-pjt8j" (OuterVolumeSpecName: "kube-api-access-pjt8j") pod "a76cda73-9eb5-4a03-aa82-713af868b080" (UID: "a76cda73-9eb5-4a03-aa82-713af868b080"). InnerVolumeSpecName "kube-api-access-pjt8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.027784 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a76cda73-9eb5-4a03-aa82-713af868b080" (UID: "a76cda73-9eb5-4a03-aa82-713af868b080"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.089004 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjt8j\" (UniqueName: \"kubernetes.io/projected/a76cda73-9eb5-4a03-aa82-713af868b080-kube-api-access-pjt8j\") on node \"crc\" DevicePath \"\""
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.089041 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.089057 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76cda73-9eb5-4a03-aa82-713af868b080-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.247276 4931 generic.go:334] "Generic (PLEG): container finished" podID="a76cda73-9eb5-4a03-aa82-713af868b080" containerID="37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35" exitCode=0
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.247353 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44rwk" event={"ID":"a76cda73-9eb5-4a03-aa82-713af868b080","Type":"ContainerDied","Data":"37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35"}
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.247454 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44rwk" event={"ID":"a76cda73-9eb5-4a03-aa82-713af868b080","Type":"ContainerDied","Data":"13f9e353d857817a966433846a45a41341967cdad2202309b77ec7da31ff2c31"}
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.247487 4931 scope.go:117] "RemoveContainer" containerID="37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.247865 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44rwk"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.289495 4931 scope.go:117] "RemoveContainer" containerID="f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.309974 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44rwk"]
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.321880 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-44rwk"]
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.816348 4931 scope.go:117] "RemoveContainer" containerID="afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.905291 4931 scope.go:117] "RemoveContainer" containerID="37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35"
Jan 30 06:23:46 crc kubenswrapper[4931]: E0130 06:23:46.905929 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35\": container with ID starting with 37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35 not found: ID does not exist" containerID="37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.905981 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35"} err="failed to get container status \"37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35\": rpc error: code = NotFound desc = could not find container \"37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35\": container with ID starting with 37a60ee51f1eacc8f4a70b4bc576ee6a4ef069c876a9fe53fe08b8bd96e2ba35 not found: ID does not exist"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.906018 4931 scope.go:117] "RemoveContainer" containerID="f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587"
Jan 30 06:23:46 crc kubenswrapper[4931]: E0130 06:23:46.907584 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587\": container with ID starting with f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587 not found: ID does not exist" containerID="f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.907647 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587"} err="failed to get container status \"f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587\": rpc error: code = NotFound desc = could not find container \"f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587\": container with ID starting with f4f135e27efe07e242a7194bef57d51e8baf7e9411831f604ac271ebfa69f587 not found: ID does not exist"
Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.907709 4931 scope.go:117] "RemoveContainer" containerID="afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd"
Jan 30 06:23:46 crc kubenswrapper[4931]: E0130
06:23:46.908222 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd\": container with ID starting with afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd not found: ID does not exist" containerID="afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd" Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.908315 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd"} err="failed to get container status \"afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd\": rpc error: code = NotFound desc = could not find container \"afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd\": container with ID starting with afd03841b8f733b7ed95951668267d08caf2a799b4210665020fcf7a1845facd not found: ID does not exist" Jan 30 06:23:46 crc kubenswrapper[4931]: I0130 06:23:46.929739 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hfrzc" Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.005351 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe686ff0-2117-48c0-bde6-41faa75e59b7-operator-scripts\") pod \"fe686ff0-2117-48c0-bde6-41faa75e59b7\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.005510 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6zn5\" (UniqueName: \"kubernetes.io/projected/fe686ff0-2117-48c0-bde6-41faa75e59b7-kube-api-access-l6zn5\") pod \"fe686ff0-2117-48c0-bde6-41faa75e59b7\" (UID: \"fe686ff0-2117-48c0-bde6-41faa75e59b7\") " Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.006795 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe686ff0-2117-48c0-bde6-41faa75e59b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe686ff0-2117-48c0-bde6-41faa75e59b7" (UID: "fe686ff0-2117-48c0-bde6-41faa75e59b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.011829 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe686ff0-2117-48c0-bde6-41faa75e59b7-kube-api-access-l6zn5" (OuterVolumeSpecName: "kube-api-access-l6zn5") pod "fe686ff0-2117-48c0-bde6-41faa75e59b7" (UID: "fe686ff0-2117-48c0-bde6-41faa75e59b7"). InnerVolumeSpecName "kube-api-access-l6zn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.107482 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6zn5\" (UniqueName: \"kubernetes.io/projected/fe686ff0-2117-48c0-bde6-41faa75e59b7-kube-api-access-l6zn5\") on node \"crc\" DevicePath \"\"" Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.107859 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe686ff0-2117-48c0-bde6-41faa75e59b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.263002 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hfrzc" event={"ID":"fe686ff0-2117-48c0-bde6-41faa75e59b7","Type":"ContainerDied","Data":"2035b861086e20c1fb342c823c4eb82d86fc588a7b4c49c61d68ad62b4ba70ac"} Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.263070 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2035b861086e20c1fb342c823c4eb82d86fc588a7b4c49c61d68ad62b4ba70ac" Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.263159 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hfrzc" Jan 30 06:23:47 crc kubenswrapper[4931]: I0130 06:23:47.438553 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" path="/var/lib/kubelet/pods/a76cda73-9eb5-4a03-aa82-713af868b080/volumes" Jan 30 06:23:50 crc kubenswrapper[4931]: I0130 06:23:50.467849 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hfrzc"] Jan 30 06:23:50 crc kubenswrapper[4931]: I0130 06:23:50.474756 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hfrzc"] Jan 30 06:23:51 crc kubenswrapper[4931]: I0130 06:23:51.436377 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe686ff0-2117-48c0-bde6-41faa75e59b7" path="/var/lib/kubelet/pods/fe686ff0-2117-48c0-bde6-41faa75e59b7/volumes" Jan 30 06:23:52 crc kubenswrapper[4931]: I0130 06:23:52.422720 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:23:52 crc kubenswrapper[4931]: E0130 06:23:52.423162 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.485007 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-4x52j"] Jan 30 06:23:55 crc kubenswrapper[4931]: E0130 06:23:55.485944 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="registry-server" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 
06:23:55.485966 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="registry-server" Jan 30 06:23:55 crc kubenswrapper[4931]: E0130 06:23:55.485989 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="extract-content" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.486002 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="extract-content" Jan 30 06:23:55 crc kubenswrapper[4931]: E0130 06:23:55.486022 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe686ff0-2117-48c0-bde6-41faa75e59b7" containerName="mariadb-account-create-update" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.486036 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe686ff0-2117-48c0-bde6-41faa75e59b7" containerName="mariadb-account-create-update" Jan 30 06:23:55 crc kubenswrapper[4931]: E0130 06:23:55.486063 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="extract-utilities" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.486075 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="extract-utilities" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.486341 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76cda73-9eb5-4a03-aa82-713af868b080" containerName="registry-server" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.486368 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe686ff0-2117-48c0-bde6-41faa75e59b7" containerName="mariadb-account-create-update" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.489083 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.491965 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.495754 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4x52j"] Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.574274 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmnmx\" (UniqueName: \"kubernetes.io/projected/56513a2a-14aa-4055-8b35-de5c272faab9-kube-api-access-bmnmx\") pod \"root-account-create-update-4x52j\" (UID: \"56513a2a-14aa-4055-8b35-de5c272faab9\") " pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.574411 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56513a2a-14aa-4055-8b35-de5c272faab9-operator-scripts\") pod \"root-account-create-update-4x52j\" (UID: \"56513a2a-14aa-4055-8b35-de5c272faab9\") " pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.677011 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmnmx\" (UniqueName: \"kubernetes.io/projected/56513a2a-14aa-4055-8b35-de5c272faab9-kube-api-access-bmnmx\") pod \"root-account-create-update-4x52j\" (UID: \"56513a2a-14aa-4055-8b35-de5c272faab9\") " pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.677124 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56513a2a-14aa-4055-8b35-de5c272faab9-operator-scripts\") pod \"root-account-create-update-4x52j\" (UID: 
\"56513a2a-14aa-4055-8b35-de5c272faab9\") " pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.678514 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56513a2a-14aa-4055-8b35-de5c272faab9-operator-scripts\") pod \"root-account-create-update-4x52j\" (UID: \"56513a2a-14aa-4055-8b35-de5c272faab9\") " pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.711398 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmnmx\" (UniqueName: \"kubernetes.io/projected/56513a2a-14aa-4055-8b35-de5c272faab9-kube-api-access-bmnmx\") pod \"root-account-create-update-4x52j\" (UID: \"56513a2a-14aa-4055-8b35-de5c272faab9\") " pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:55 crc kubenswrapper[4931]: I0130 06:23:55.819832 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:56 crc kubenswrapper[4931]: I0130 06:23:56.118176 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4x52j"] Jan 30 06:23:56 crc kubenswrapper[4931]: I0130 06:23:56.353945 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4x52j" event={"ID":"56513a2a-14aa-4055-8b35-de5c272faab9","Type":"ContainerStarted","Data":"88feac8d2b5d033c066731d3fa1d66cc34b935b576cc56423d76770840e869ad"} Jan 30 06:23:56 crc kubenswrapper[4931]: I0130 06:23:56.354017 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4x52j" event={"ID":"56513a2a-14aa-4055-8b35-de5c272faab9","Type":"ContainerStarted","Data":"39f25fd44e0bc97c99560df022df6c67ee698d6e72a5c8a01753ba3f5a6baf85"} Jan 30 06:23:56 crc kubenswrapper[4931]: I0130 06:23:56.390009 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-4x52j" podStartSLOduration=1.389973283 podStartE2EDuration="1.389973283s" podCreationTimestamp="2026-01-30 06:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:23:56.381588288 +0000 UTC m=+4571.751498575" watchObservedRunningTime="2026-01-30 06:23:56.389973283 +0000 UTC m=+4571.759883580" Jan 30 06:23:57 crc kubenswrapper[4931]: I0130 06:23:57.365780 4931 generic.go:334] "Generic (PLEG): container finished" podID="56513a2a-14aa-4055-8b35-de5c272faab9" containerID="88feac8d2b5d033c066731d3fa1d66cc34b935b576cc56423d76770840e869ad" exitCode=0 Jan 30 06:23:57 crc kubenswrapper[4931]: I0130 06:23:57.365837 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4x52j" 
event={"ID":"56513a2a-14aa-4055-8b35-de5c272faab9","Type":"ContainerDied","Data":"88feac8d2b5d033c066731d3fa1d66cc34b935b576cc56423d76770840e869ad"} Jan 30 06:23:58 crc kubenswrapper[4931]: I0130 06:23:58.761450 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:58 crc kubenswrapper[4931]: I0130 06:23:58.840758 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmnmx\" (UniqueName: \"kubernetes.io/projected/56513a2a-14aa-4055-8b35-de5c272faab9-kube-api-access-bmnmx\") pod \"56513a2a-14aa-4055-8b35-de5c272faab9\" (UID: \"56513a2a-14aa-4055-8b35-de5c272faab9\") " Jan 30 06:23:58 crc kubenswrapper[4931]: I0130 06:23:58.840850 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56513a2a-14aa-4055-8b35-de5c272faab9-operator-scripts\") pod \"56513a2a-14aa-4055-8b35-de5c272faab9\" (UID: \"56513a2a-14aa-4055-8b35-de5c272faab9\") " Jan 30 06:23:58 crc kubenswrapper[4931]: I0130 06:23:58.841382 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56513a2a-14aa-4055-8b35-de5c272faab9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56513a2a-14aa-4055-8b35-de5c272faab9" (UID: "56513a2a-14aa-4055-8b35-de5c272faab9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:23:58 crc kubenswrapper[4931]: I0130 06:23:58.849468 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56513a2a-14aa-4055-8b35-de5c272faab9-kube-api-access-bmnmx" (OuterVolumeSpecName: "kube-api-access-bmnmx") pod "56513a2a-14aa-4055-8b35-de5c272faab9" (UID: "56513a2a-14aa-4055-8b35-de5c272faab9"). InnerVolumeSpecName "kube-api-access-bmnmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:23:58 crc kubenswrapper[4931]: I0130 06:23:58.942721 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56513a2a-14aa-4055-8b35-de5c272faab9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:23:58 crc kubenswrapper[4931]: I0130 06:23:58.942762 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmnmx\" (UniqueName: \"kubernetes.io/projected/56513a2a-14aa-4055-8b35-de5c272faab9-kube-api-access-bmnmx\") on node \"crc\" DevicePath \"\"" Jan 30 06:23:59 crc kubenswrapper[4931]: I0130 06:23:59.388006 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4x52j" Jan 30 06:23:59 crc kubenswrapper[4931]: I0130 06:23:59.387999 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4x52j" event={"ID":"56513a2a-14aa-4055-8b35-de5c272faab9","Type":"ContainerDied","Data":"39f25fd44e0bc97c99560df022df6c67ee698d6e72a5c8a01753ba3f5a6baf85"} Jan 30 06:23:59 crc kubenswrapper[4931]: I0130 06:23:59.388639 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f25fd44e0bc97c99560df022df6c67ee698d6e72a5c8a01753ba3f5a6baf85" Jan 30 06:23:59 crc kubenswrapper[4931]: I0130 06:23:59.390581 4931 generic.go:334] "Generic (PLEG): container finished" podID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerID="1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee" exitCode=0 Jan 30 06:23:59 crc kubenswrapper[4931]: I0130 06:23:59.390642 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2ffda181-212b-42f4-bd56-9ab2864ded3c","Type":"ContainerDied","Data":"1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee"} Jan 30 06:23:59 crc kubenswrapper[4931]: I0130 06:23:59.393974 4931 generic.go:334] "Generic (PLEG): 
container finished" podID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerID="ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f" exitCode=0 Jan 30 06:23:59 crc kubenswrapper[4931]: I0130 06:23:59.394020 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3e54229-729f-4bfc-a208-dc39edc35b8a","Type":"ContainerDied","Data":"ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f"} Jan 30 06:24:00 crc kubenswrapper[4931]: I0130 06:24:00.403588 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2ffda181-212b-42f4-bd56-9ab2864ded3c","Type":"ContainerStarted","Data":"a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523"} Jan 30 06:24:00 crc kubenswrapper[4931]: I0130 06:24:00.404668 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 06:24:00 crc kubenswrapper[4931]: I0130 06:24:00.405658 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3e54229-729f-4bfc-a208-dc39edc35b8a","Type":"ContainerStarted","Data":"2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf"} Jan 30 06:24:00 crc kubenswrapper[4931]: I0130 06:24:00.406341 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:00 crc kubenswrapper[4931]: I0130 06:24:00.427827 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.427811807 podStartE2EDuration="38.427811807s" podCreationTimestamp="2026-01-30 06:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:24:00.425584585 +0000 UTC m=+4575.795494872" watchObservedRunningTime="2026-01-30 06:24:00.427811807 +0000 UTC m=+4575.797722064" Jan 30 06:24:00 crc 
kubenswrapper[4931]: I0130 06:24:00.453903 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.453886376 podStartE2EDuration="38.453886376s" podCreationTimestamp="2026-01-30 06:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:24:00.451071337 +0000 UTC m=+4575.820981684" watchObservedRunningTime="2026-01-30 06:24:00.453886376 +0000 UTC m=+4575.823796633" Jan 30 06:24:05 crc kubenswrapper[4931]: I0130 06:24:05.426179 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:24:06 crc kubenswrapper[4931]: I0130 06:24:06.456244 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"2c01137c96244e8746673f3822245d96905f563a3e4aa39b2a8d7db22e60ff5b"} Jan 30 06:24:14 crc kubenswrapper[4931]: I0130 06:24:14.035773 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:14 crc kubenswrapper[4931]: I0130 06:24:14.109055 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.204659 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699964fbc-8dc7b"] Jan 30 06:24:20 crc kubenswrapper[4931]: E0130 06:24:20.205601 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56513a2a-14aa-4055-8b35-de5c272faab9" containerName="mariadb-account-create-update" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.205621 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="56513a2a-14aa-4055-8b35-de5c272faab9" containerName="mariadb-account-create-update" 
Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.205811 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="56513a2a-14aa-4055-8b35-de5c272faab9" containerName="mariadb-account-create-update" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.206706 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.224656 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-8dc7b"] Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.324704 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-dns-svc\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.324787 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-config\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.324949 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/df16978b-d22c-4dd1-87d8-330cf82a859d-kube-api-access-r68qr\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.426691 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-config\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.426760 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/df16978b-d22c-4dd1-87d8-330cf82a859d-kube-api-access-r68qr\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.426857 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-dns-svc\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.427817 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-config\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.427911 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-dns-svc\") pod \"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.460129 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/df16978b-d22c-4dd1-87d8-330cf82a859d-kube-api-access-r68qr\") pod 
\"dnsmasq-dns-699964fbc-8dc7b\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") " pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:20 crc kubenswrapper[4931]: I0130 06:24:20.554502 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:21 crc kubenswrapper[4931]: I0130 06:24:21.045703 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:24:21 crc kubenswrapper[4931]: W0130 06:24:21.113621 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf16978b_d22c_4dd1_87d8_330cf82a859d.slice/crio-e39274ddf419bf92df1598af572cac14fef4ddf978c753b8f221ebec54b897c7 WatchSource:0}: Error finding container e39274ddf419bf92df1598af572cac14fef4ddf978c753b8f221ebec54b897c7: Status 404 returned error can't find the container with id e39274ddf419bf92df1598af572cac14fef4ddf978c753b8f221ebec54b897c7 Jan 30 06:24:21 crc kubenswrapper[4931]: I0130 06:24:21.117403 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-8dc7b"] Jan 30 06:24:21 crc kubenswrapper[4931]: I0130 06:24:21.630970 4931 generic.go:334] "Generic (PLEG): container finished" podID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerID="a56ecc0c98ffc762965d58506b3e81c6c6637f6a00e16f27ab2be355f3d037e0" exitCode=0 Jan 30 06:24:21 crc kubenswrapper[4931]: I0130 06:24:21.631016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" event={"ID":"df16978b-d22c-4dd1-87d8-330cf82a859d","Type":"ContainerDied","Data":"a56ecc0c98ffc762965d58506b3e81c6c6637f6a00e16f27ab2be355f3d037e0"} Jan 30 06:24:21 crc kubenswrapper[4931]: I0130 06:24:21.631312 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" 
event={"ID":"df16978b-d22c-4dd1-87d8-330cf82a859d","Type":"ContainerStarted","Data":"e39274ddf419bf92df1598af572cac14fef4ddf978c753b8f221ebec54b897c7"} Jan 30 06:24:21 crc kubenswrapper[4931]: I0130 06:24:21.722157 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:24:22 crc kubenswrapper[4931]: I0130 06:24:22.640398 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" event={"ID":"df16978b-d22c-4dd1-87d8-330cf82a859d","Type":"ContainerStarted","Data":"c38a0669baf43cbcb3f248e3fbd7e19a5da64c872efcf71b2a52e9bac3e9cedd"} Jan 30 06:24:22 crc kubenswrapper[4931]: I0130 06:24:22.641067 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:22 crc kubenswrapper[4931]: I0130 06:24:22.664442 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" podStartSLOduration=2.664412048 podStartE2EDuration="2.664412048s" podCreationTimestamp="2026-01-30 06:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:24:22.662569067 +0000 UTC m=+4598.032479364" watchObservedRunningTime="2026-01-30 06:24:22.664412048 +0000 UTC m=+4598.034322305" Jan 30 06:24:22 crc kubenswrapper[4931]: I0130 06:24:22.886046 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerName="rabbitmq" containerID="cri-o://a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523" gracePeriod=604799 Jan 30 06:24:23 crc kubenswrapper[4931]: I0130 06:24:23.517702 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerName="rabbitmq" 
containerID="cri-o://2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf" gracePeriod=604799 Jan 30 06:24:24 crc kubenswrapper[4931]: I0130 06:24:24.031494 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.249:5672: connect: connection refused" Jan 30 06:24:24 crc kubenswrapper[4931]: I0130 06:24:24.106110 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.248:5672: connect: connection refused" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.499127 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.622654 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-erlang-cookie\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.622699 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-confd\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.622740 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ngdz\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-kube-api-access-2ngdz\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: 
\"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.622784 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ffda181-212b-42f4-bd56-9ab2864ded3c-pod-info\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.622829 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-server-conf\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.623042 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.623096 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-plugins-conf\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.623137 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ffda181-212b-42f4-bd56-9ab2864ded3c-erlang-cookie-secret\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.623207 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-plugins\") pod \"2ffda181-212b-42f4-bd56-9ab2864ded3c\" (UID: \"2ffda181-212b-42f4-bd56-9ab2864ded3c\") " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.623670 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.623920 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.623933 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.624020 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.627954 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2ffda181-212b-42f4-bd56-9ab2864ded3c-pod-info" (OuterVolumeSpecName: "pod-info") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.629272 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ffda181-212b-42f4-bd56-9ab2864ded3c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.635486 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485" (OuterVolumeSpecName: "persistence") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.647962 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-kube-api-access-2ngdz" (OuterVolumeSpecName: "kube-api-access-2ngdz") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "kube-api-access-2ngdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.659077 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-server-conf" (OuterVolumeSpecName: "server-conf") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.719893 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2ffda181-212b-42f4-bd56-9ab2864ded3c" (UID: "2ffda181-212b-42f4-bd56-9ab2864ded3c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.724028 4931 generic.go:334] "Generic (PLEG): container finished" podID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerID="a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523" exitCode=0 Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.724074 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2ffda181-212b-42f4-bd56-9ab2864ded3c","Type":"ContainerDied","Data":"a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523"} Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.724106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2ffda181-212b-42f4-bd56-9ab2864ded3c","Type":"ContainerDied","Data":"d1546caee0e50a6ac2e387f8e39e4451c82f903d75d2682ba4430ed473fac38f"} Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.724126 4931 scope.go:117] "RemoveContainer" containerID="a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523" Jan 30 06:24:29 crc 
kubenswrapper[4931]: I0130 06:24:29.724234 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.736915 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.736947 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ngdz\" (UniqueName: \"kubernetes.io/projected/2ffda181-212b-42f4-bd56-9ab2864ded3c-kube-api-access-2ngdz\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.736958 4931 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ffda181-212b-42f4-bd56-9ab2864ded3c-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.736967 4931 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.737002 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") on node \"crc\" " Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.737014 4931 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ffda181-212b-42f4-bd56-9ab2864ded3c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.737026 4931 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/2ffda181-212b-42f4-bd56-9ab2864ded3c-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.737037 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ffda181-212b-42f4-bd56-9ab2864ded3c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.753046 4931 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.753175 4931 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485") on node "crc" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.762515 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.766737 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.788864 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:24:29 crc kubenswrapper[4931]: E0130 06:24:29.789241 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerName="setup-container" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.789256 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerName="setup-container" Jan 30 06:24:29 crc kubenswrapper[4931]: E0130 06:24:29.789290 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerName="rabbitmq" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 
06:24:29.789298 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerName="rabbitmq" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.789518 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" containerName="rabbitmq" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.790434 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.794048 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.794088 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.794095 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mgh9s" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.794382 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.794533 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.795888 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.839796 4931 reconciler_common.go:293] "Volume detached for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.876489 4931 scope.go:117] "RemoveContainer" containerID="1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee" Jan 
30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.897723 4931 scope.go:117] "RemoveContainer" containerID="a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523" Jan 30 06:24:29 crc kubenswrapper[4931]: E0130 06:24:29.898320 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523\": container with ID starting with a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523 not found: ID does not exist" containerID="a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.898379 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523"} err="failed to get container status \"a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523\": rpc error: code = NotFound desc = could not find container \"a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523\": container with ID starting with a149fd78b00119365a78cb917558b9ad6509169c4733a14475d7be14b2c5b523 not found: ID does not exist" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.898409 4931 scope.go:117] "RemoveContainer" containerID="1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee" Jan 30 06:24:29 crc kubenswrapper[4931]: E0130 06:24:29.898860 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee\": container with ID starting with 1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee not found: ID does not exist" containerID="1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.898903 4931 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee"} err="failed to get container status \"1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee\": rpc error: code = NotFound desc = could not find container \"1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee\": container with ID starting with 1ecf05555efd310d268e093a1d1f4f0d2289ee41c1d84fd8cf157dc399ec6bee not found: ID does not exist" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941290 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941365 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941468 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8dabfefe-4927-44d0-b370-f7e28f2a4f57-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8dabfefe-4927-44d0-b370-f7e28f2a4f57-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " 
pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941633 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8dabfefe-4927-44d0-b370-f7e28f2a4f57-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941680 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941715 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27h4b\" (UniqueName: \"kubernetes.io/projected/8dabfefe-4927-44d0-b370-f7e28f2a4f57-kube-api-access-27h4b\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941763 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:29 crc kubenswrapper[4931]: I0130 06:24:29.941804 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8dabfefe-4927-44d0-b370-f7e28f2a4f57-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " 
pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.043462 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8dabfefe-4927-44d0-b370-f7e28f2a4f57-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.043897 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8dabfefe-4927-44d0-b370-f7e28f2a4f57-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.043933 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8dabfefe-4927-44d0-b370-f7e28f2a4f57-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.043984 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.044051 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27h4b\" (UniqueName: \"kubernetes.io/projected/8dabfefe-4927-44d0-b370-f7e28f2a4f57-kube-api-access-27h4b\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.044101 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.044156 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8dabfefe-4927-44d0-b370-f7e28f2a4f57-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.044261 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.044304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.045248 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.045256 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.047580 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8dabfefe-4927-44d0-b370-f7e28f2a4f57-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.048468 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8dabfefe-4927-44d0-b370-f7e28f2a4f57-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.049585 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.049679 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/96f6795482cf208a0436fcedd4e13f5ef58c9a3e2d9d6166beea188ab34f9e81/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.052607 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8dabfefe-4927-44d0-b370-f7e28f2a4f57-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.057259 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8dabfefe-4927-44d0-b370-f7e28f2a4f57-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.057772 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8dabfefe-4927-44d0-b370-f7e28f2a4f57-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.073511 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27h4b\" (UniqueName: \"kubernetes.io/projected/8dabfefe-4927-44d0-b370-f7e28f2a4f57-kube-api-access-27h4b\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " 
pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.089811 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-57c51230-7d07-4cc2-b5d8-2ac82e509485\") pod \"rabbitmq-server-0\" (UID: \"8dabfefe-4927-44d0-b370-f7e28f2a4f57\") " pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.151175 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.153413 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.247862 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-plugins\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.247957 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3e54229-729f-4bfc-a208-dc39edc35b8a-erlang-cookie-secret\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.247994 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-confd\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248031 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c3e54229-729f-4bfc-a208-dc39edc35b8a-pod-info\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248075 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-server-conf\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248102 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-plugins-conf\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248139 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-erlang-cookie\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248324 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: \"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248343 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bknnm\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-kube-api-access-bknnm\") pod \"c3e54229-729f-4bfc-a208-dc39edc35b8a\" (UID: 
\"c3e54229-729f-4bfc-a208-dc39edc35b8a\") " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248537 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.248716 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.249448 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.249877 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.254175 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c3e54229-729f-4bfc-a208-dc39edc35b8a-pod-info" (OuterVolumeSpecName: "pod-info") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.257833 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e54229-729f-4bfc-a208-dc39edc35b8a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.269651 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-kube-api-access-bknnm" (OuterVolumeSpecName: "kube-api-access-bknnm") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "kube-api-access-bknnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.273364 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-server-conf" (OuterVolumeSpecName: "server-conf") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.278086 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d" (OuterVolumeSpecName: "persistence") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "pvc-73b1cc26-6baa-43d9-842a-e2612558a78d". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.347725 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c3e54229-729f-4bfc-a208-dc39edc35b8a" (UID: "c3e54229-729f-4bfc-a208-dc39edc35b8a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350624 4931 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3e54229-729f-4bfc-a208-dc39edc35b8a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350676 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350692 4931 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c3e54229-729f-4bfc-a208-dc39edc35b8a-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350703 4931 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350715 4931 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3e54229-729f-4bfc-a208-dc39edc35b8a-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350728 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3e54229-729f-4bfc-a208-dc39edc35b8a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350787 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") on node \"crc\" " Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.350803 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bknnm\" (UniqueName: \"kubernetes.io/projected/c3e54229-729f-4bfc-a208-dc39edc35b8a-kube-api-access-bknnm\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.375608 4931 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.376090 4931 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-73b1cc26-6baa-43d9-842a-e2612558a78d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d") on node "crc" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.452638 4931 reconciler_common.go:293] "Volume detached for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") on node \"crc\" DevicePath \"\"" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.556729 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.626087 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-7dq6z"] Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.626464 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" podUID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerName="dnsmasq-dns" containerID="cri-o://8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46" gracePeriod=10 Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.692249 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:24:30 crc kubenswrapper[4931]: W0130 06:24:30.727930 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dabfefe_4927_44d0_b370_f7e28f2a4f57.slice/crio-26192b5a98ae39a9cf72b0bd822a0e13879c7330012c002611b214907a438036 WatchSource:0}: Error finding container 26192b5a98ae39a9cf72b0bd822a0e13879c7330012c002611b214907a438036: Status 404 returned error can't find the container with id 
26192b5a98ae39a9cf72b0bd822a0e13879c7330012c002611b214907a438036 Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.741969 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerID="2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf" exitCode=0 Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.742008 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3e54229-729f-4bfc-a208-dc39edc35b8a","Type":"ContainerDied","Data":"2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf"} Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.742030 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3e54229-729f-4bfc-a208-dc39edc35b8a","Type":"ContainerDied","Data":"8795a34898e6841b11366ed2b6c1d443c46d48e62ae64548732f8a0d5ccd026d"} Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.742046 4931 scope.go:117] "RemoveContainer" containerID="2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.742186 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.799092 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.805330 4931 scope.go:117] "RemoveContainer" containerID="ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.819472 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.827294 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:24:30 crc kubenswrapper[4931]: E0130 06:24:30.827710 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerName="rabbitmq" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.827724 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerName="rabbitmq" Jan 30 06:24:30 crc kubenswrapper[4931]: E0130 06:24:30.827744 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerName="setup-container" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.827751 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerName="setup-container" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.827891 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" containerName="rabbitmq" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.831047 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.834080 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.834247 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.834413 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.834647 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wgz6z" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.834856 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.837237 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.844707 4931 scope.go:117] "RemoveContainer" containerID="2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf" Jan 30 06:24:30 crc kubenswrapper[4931]: E0130 06:24:30.845237 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf\": container with ID starting with 2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf not found: ID does not exist" containerID="2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.845285 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf"} err="failed to get 
container status \"2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf\": rpc error: code = NotFound desc = could not find container \"2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf\": container with ID starting with 2265dcbb515242dbc1813f20af18edf4d5c884fae8b8b34b8e14ce4d80da9cbf not found: ID does not exist" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.845310 4931 scope.go:117] "RemoveContainer" containerID="ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f" Jan 30 06:24:30 crc kubenswrapper[4931]: E0130 06:24:30.845572 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f\": container with ID starting with ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f not found: ID does not exist" containerID="ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.845594 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f"} err="failed to get container status \"ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f\": rpc error: code = NotFound desc = could not find container \"ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f\": container with ID starting with ec04384c6d230b5ff8300554f9d2de9fa198aca174b998590f18c0a77858c43f not found: ID does not exist" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966491 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 
crc kubenswrapper[4931]: I0130 06:24:30.966552 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966573 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966634 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966669 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gthw6\" (UniqueName: \"kubernetes.io/projected/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-kube-api-access-gthw6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966707 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966723 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:30 crc kubenswrapper[4931]: I0130 06:24:30.966739 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067484 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067780 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067802 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067830 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067857 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067875 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gthw6\" (UniqueName: \"kubernetes.io/projected/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-kube-api-access-gthw6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067898 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc 
kubenswrapper[4931]: I0130 06:24:31.067916 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.067930 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.068483 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.069643 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.071735 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.072886 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.074029 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.074799 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.076763 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.079052 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.079111 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/081bf8f84c60dfb918ea0eb5418be09e59105cf3295dde894d1b133731bc6391/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.089814 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gthw6\" (UniqueName: \"kubernetes.io/projected/ea4042a1-4ebc-4b11-a7e4-e695a668aa81-kube-api-access-gthw6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.108610 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.113898 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73b1cc26-6baa-43d9-842a-e2612558a78d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea4042a1-4ebc-4b11-a7e4-e695a668aa81\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.223361 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.271024 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l2x7\" (UniqueName: \"kubernetes.io/projected/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-kube-api-access-5l2x7\") pod \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.271227 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-config\") pod \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.271283 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-dns-svc\") pod \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\" (UID: \"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c\") " Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.276786 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-kube-api-access-5l2x7" (OuterVolumeSpecName: "kube-api-access-5l2x7") pod "a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" (UID: "a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c"). InnerVolumeSpecName "kube-api-access-5l2x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.321115 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" (UID: "a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.324231 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-config" (OuterVolumeSpecName: "config") pod "a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" (UID: "a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.376849 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l2x7\" (UniqueName: \"kubernetes.io/projected/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-kube-api-access-5l2x7\") on node \"crc\" DevicePath \"\""
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.377113 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.377123 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.431628 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ffda181-212b-42f4-bd56-9ab2864ded3c" path="/var/lib/kubelet/pods/2ffda181-212b-42f4-bd56-9ab2864ded3c/volumes"
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.432793 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3e54229-729f-4bfc-a208-dc39edc35b8a" path="/var/lib/kubelet/pods/c3e54229-729f-4bfc-a208-dc39edc35b8a/volumes"
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.697067 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 06:24:31 crc kubenswrapper[4931]: W0130 06:24:31.709357 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea4042a1_4ebc_4b11_a7e4_e695a668aa81.slice/crio-258fba18620fdf53d39a231cccf314fbf000ec50000e662c1ea9ce4f5aa04430 WatchSource:0}: Error finding container 258fba18620fdf53d39a231cccf314fbf000ec50000e662c1ea9ce4f5aa04430: Status 404 returned error can't find the container with id 258fba18620fdf53d39a231cccf314fbf000ec50000e662c1ea9ce4f5aa04430
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.755386 4931 generic.go:334] "Generic (PLEG): container finished" podID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerID="8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46" exitCode=0
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.755564 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" event={"ID":"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c","Type":"ContainerDied","Data":"8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46"}
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.755655 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z" event={"ID":"a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c","Type":"ContainerDied","Data":"df23f9bc8284d347a4dda805a55b9a64a2739f707161ba449d8a6399ee4d3665"}
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.755650 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-7dq6z"
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.755689 4931 scope.go:117] "RemoveContainer" containerID="8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46"
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.760780 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea4042a1-4ebc-4b11-a7e4-e695a668aa81","Type":"ContainerStarted","Data":"258fba18620fdf53d39a231cccf314fbf000ec50000e662c1ea9ce4f5aa04430"}
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.767198 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8dabfefe-4927-44d0-b370-f7e28f2a4f57","Type":"ContainerStarted","Data":"26192b5a98ae39a9cf72b0bd822a0e13879c7330012c002611b214907a438036"}
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.888275 4931 scope.go:117] "RemoveContainer" containerID="85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7"
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.923161 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-7dq6z"]
Jan 30 06:24:31 crc kubenswrapper[4931]: I0130 06:24:31.930564 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-7dq6z"]
Jan 30 06:24:32 crc kubenswrapper[4931]: I0130 06:24:32.021334 4931 scope.go:117] "RemoveContainer" containerID="8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46"
Jan 30 06:24:32 crc kubenswrapper[4931]: E0130 06:24:32.021965 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46\": container with ID starting with 8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46 not found: ID does not exist" containerID="8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46"
Jan 30 06:24:32 crc kubenswrapper[4931]: I0130 06:24:32.022022 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46"} err="failed to get container status \"8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46\": rpc error: code = NotFound desc = could not find container \"8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46\": container with ID starting with 8077550b8b238461a8681b1d15ec72e4c6bfd9ffea2a3677dfce9da31986db46 not found: ID does not exist"
Jan 30 06:24:32 crc kubenswrapper[4931]: I0130 06:24:32.022060 4931 scope.go:117] "RemoveContainer" containerID="85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7"
Jan 30 06:24:32 crc kubenswrapper[4931]: E0130 06:24:32.022679 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7\": container with ID starting with 85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7 not found: ID does not exist" containerID="85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7"
Jan 30 06:24:32 crc kubenswrapper[4931]: I0130 06:24:32.022722 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7"} err="failed to get container status \"85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7\": rpc error: code = NotFound desc = could not find container \"85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7\": container with ID starting with 85ba0da9a4253ef8d549cbbe8037a7a7fd65be4b07650d2f13e5de75d2006dc7 not found: ID does not exist"
Jan 30 06:24:32 crc kubenswrapper[4931]: I0130 06:24:32.779006 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8dabfefe-4927-44d0-b370-f7e28f2a4f57","Type":"ContainerStarted","Data":"4a7054ec29eebc4e0ea1decb7fd718f56644883114ed130ad148289acb6131f6"}
Jan 30 06:24:33 crc kubenswrapper[4931]: I0130 06:24:33.437057 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" path="/var/lib/kubelet/pods/a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c/volumes"
Jan 30 06:24:33 crc kubenswrapper[4931]: I0130 06:24:33.795316 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea4042a1-4ebc-4b11-a7e4-e695a668aa81","Type":"ContainerStarted","Data":"e9b635eb7fff471125ec7e17c4a2d16fe738c9f4ff34ea8d446cfc2643643db8"}
Jan 30 06:25:06 crc kubenswrapper[4931]: I0130 06:25:06.127254 4931 generic.go:334] "Generic (PLEG): container finished" podID="8dabfefe-4927-44d0-b370-f7e28f2a4f57" containerID="4a7054ec29eebc4e0ea1decb7fd718f56644883114ed130ad148289acb6131f6" exitCode=0
Jan 30 06:25:06 crc kubenswrapper[4931]: I0130 06:25:06.127324 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8dabfefe-4927-44d0-b370-f7e28f2a4f57","Type":"ContainerDied","Data":"4a7054ec29eebc4e0ea1decb7fd718f56644883114ed130ad148289acb6131f6"}
Jan 30 06:25:07 crc kubenswrapper[4931]: I0130 06:25:07.177289 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8dabfefe-4927-44d0-b370-f7e28f2a4f57","Type":"ContainerStarted","Data":"b4e923c38a6c09f553fc519448bfaa59f47f84f3474d9f91b2de09906bc96c20"}
Jan 30 06:25:07 crc kubenswrapper[4931]: I0130 06:25:07.177801 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 30 06:25:07 crc kubenswrapper[4931]: I0130 06:25:07.208305 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.208290095 podStartE2EDuration="38.208290095s" podCreationTimestamp="2026-01-30 06:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:25:07.207117892 +0000 UTC m=+4642.577028189" watchObservedRunningTime="2026-01-30 06:25:07.208290095 +0000 UTC m=+4642.578200352"
Jan 30 06:25:07 crc kubenswrapper[4931]: I0130 06:25:07.210762 4931 generic.go:334] "Generic (PLEG): container finished" podID="ea4042a1-4ebc-4b11-a7e4-e695a668aa81" containerID="e9b635eb7fff471125ec7e17c4a2d16fe738c9f4ff34ea8d446cfc2643643db8" exitCode=0
Jan 30 06:25:07 crc kubenswrapper[4931]: I0130 06:25:07.210827 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea4042a1-4ebc-4b11-a7e4-e695a668aa81","Type":"ContainerDied","Data":"e9b635eb7fff471125ec7e17c4a2d16fe738c9f4ff34ea8d446cfc2643643db8"}
Jan 30 06:25:08 crc kubenswrapper[4931]: I0130 06:25:08.219845 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea4042a1-4ebc-4b11-a7e4-e695a668aa81","Type":"ContainerStarted","Data":"7b9bea16453033d6720d410c61025a96cabc35996588e21e535a5bf5d370b443"}
Jan 30 06:25:08 crc kubenswrapper[4931]: I0130 06:25:08.220286 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:25:08 crc kubenswrapper[4931]: I0130 06:25:08.249137 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.249112441 podStartE2EDuration="38.249112441s" podCreationTimestamp="2026-01-30 06:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:25:08.246373524 +0000 UTC m=+4643.616283791" watchObservedRunningTime="2026-01-30 06:25:08.249112441 +0000 UTC m=+4643.619022718"
Jan 30 06:25:20 crc kubenswrapper[4931]: I0130 06:25:20.155787 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 30 06:25:21 crc kubenswrapper[4931]: I0130 06:25:21.232740 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:25:27 crc kubenswrapper[4931]: I0130 06:25:27.990380 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:25:27 crc kubenswrapper[4931]: E0130 06:25:27.991627 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerName="dnsmasq-dns"
Jan 30 06:25:27 crc kubenswrapper[4931]: I0130 06:25:27.991646 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerName="dnsmasq-dns"
Jan 30 06:25:27 crc kubenswrapper[4931]: E0130 06:25:27.991660 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerName="init"
Jan 30 06:25:27 crc kubenswrapper[4931]: I0130 06:25:27.991668 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerName="init"
Jan 30 06:25:27 crc kubenswrapper[4931]: I0130 06:25:27.991846 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43ae7a0-f3ca-4eb9-ae23-e6dc8f4e5a1c" containerName="dnsmasq-dns"
Jan 30 06:25:27 crc kubenswrapper[4931]: I0130 06:25:27.992913 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:25:27 crc kubenswrapper[4931]: I0130 06:25:27.996165 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qqbkw"
Jan 30 06:25:28 crc kubenswrapper[4931]: I0130 06:25:28.000570 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:25:28 crc kubenswrapper[4931]: I0130 06:25:28.121482 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dx8c\" (UniqueName: \"kubernetes.io/projected/01ffdc71-90ec-42da-ae77-d65caba67d94-kube-api-access-4dx8c\") pod \"mariadb-client\" (UID: \"01ffdc71-90ec-42da-ae77-d65caba67d94\") " pod="openstack/mariadb-client"
Jan 30 06:25:28 crc kubenswrapper[4931]: I0130 06:25:28.223535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dx8c\" (UniqueName: \"kubernetes.io/projected/01ffdc71-90ec-42da-ae77-d65caba67d94-kube-api-access-4dx8c\") pod \"mariadb-client\" (UID: \"01ffdc71-90ec-42da-ae77-d65caba67d94\") " pod="openstack/mariadb-client"
Jan 30 06:25:28 crc kubenswrapper[4931]: I0130 06:25:28.262716 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dx8c\" (UniqueName: \"kubernetes.io/projected/01ffdc71-90ec-42da-ae77-d65caba67d94-kube-api-access-4dx8c\") pod \"mariadb-client\" (UID: \"01ffdc71-90ec-42da-ae77-d65caba67d94\") " pod="openstack/mariadb-client"
Jan 30 06:25:28 crc kubenswrapper[4931]: I0130 06:25:28.327003 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:25:28 crc kubenswrapper[4931]: I0130 06:25:28.877531 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:25:28 crc kubenswrapper[4931]: W0130 06:25:28.881063 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01ffdc71_90ec_42da_ae77_d65caba67d94.slice/crio-47b19816a25b7b5565b5d3a98e3c040159fe14ce20a317a79602e112c5e860fb WatchSource:0}: Error finding container 47b19816a25b7b5565b5d3a98e3c040159fe14ce20a317a79602e112c5e860fb: Status 404 returned error can't find the container with id 47b19816a25b7b5565b5d3a98e3c040159fe14ce20a317a79602e112c5e860fb
Jan 30 06:25:28 crc kubenswrapper[4931]: I0130 06:25:28.884332 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 06:25:29 crc kubenswrapper[4931]: I0130 06:25:29.435304 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"01ffdc71-90ec-42da-ae77-d65caba67d94","Type":"ContainerStarted","Data":"47b19816a25b7b5565b5d3a98e3c040159fe14ce20a317a79602e112c5e860fb"}
Jan 30 06:25:30 crc kubenswrapper[4931]: I0130 06:25:30.449908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"01ffdc71-90ec-42da-ae77-d65caba67d94","Type":"ContainerStarted","Data":"076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175"}
Jan 30 06:25:30 crc kubenswrapper[4931]: I0130 06:25:30.471599 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.574171144 podStartE2EDuration="3.471572609s" podCreationTimestamp="2026-01-30 06:25:27 +0000 UTC" firstStartedPulling="2026-01-30 06:25:28.883948373 +0000 UTC m=+4664.253858670" lastFinishedPulling="2026-01-30 06:25:29.781349838 +0000 UTC m=+4665.151260135" observedRunningTime="2026-01-30 06:25:30.464545662 +0000 UTC m=+4665.834455959" watchObservedRunningTime="2026-01-30 06:25:30.471572609 +0000 UTC m=+4665.841482896"
Jan 30 06:25:32 crc kubenswrapper[4931]: E0130 06:25:32.551459 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.179:49758->38.102.83.179:45103: write tcp 38.102.83.179:49758->38.102.83.179:45103: write: broken pipe
Jan 30 06:25:43 crc kubenswrapper[4931]: I0130 06:25:43.912196 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:25:43 crc kubenswrapper[4931]: I0130 06:25:43.912751 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="01ffdc71-90ec-42da-ae77-d65caba67d94" containerName="mariadb-client" containerID="cri-o://076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175" gracePeriod=30
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.476164 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.586967 4931 generic.go:334] "Generic (PLEG): container finished" podID="01ffdc71-90ec-42da-ae77-d65caba67d94" containerID="076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175" exitCode=143
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.587033 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.587025 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"01ffdc71-90ec-42da-ae77-d65caba67d94","Type":"ContainerDied","Data":"076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175"}
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.587201 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"01ffdc71-90ec-42da-ae77-d65caba67d94","Type":"ContainerDied","Data":"47b19816a25b7b5565b5d3a98e3c040159fe14ce20a317a79602e112c5e860fb"}
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.587232 4931 scope.go:117] "RemoveContainer" containerID="076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175"
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.601377 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dx8c\" (UniqueName: \"kubernetes.io/projected/01ffdc71-90ec-42da-ae77-d65caba67d94-kube-api-access-4dx8c\") pod \"01ffdc71-90ec-42da-ae77-d65caba67d94\" (UID: \"01ffdc71-90ec-42da-ae77-d65caba67d94\") "
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.608896 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ffdc71-90ec-42da-ae77-d65caba67d94-kube-api-access-4dx8c" (OuterVolumeSpecName: "kube-api-access-4dx8c") pod "01ffdc71-90ec-42da-ae77-d65caba67d94" (UID: "01ffdc71-90ec-42da-ae77-d65caba67d94"). InnerVolumeSpecName "kube-api-access-4dx8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.615281 4931 scope.go:117] "RemoveContainer" containerID="076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175"
Jan 30 06:25:44 crc kubenswrapper[4931]: E0130 06:25:44.615772 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175\": container with ID starting with 076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175 not found: ID does not exist" containerID="076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175"
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.615814 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175"} err="failed to get container status \"076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175\": rpc error: code = NotFound desc = could not find container \"076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175\": container with ID starting with 076ac035df6a30e346197c66aae9d9956205573072018033a2c8c55784c52175 not found: ID does not exist"
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.703198 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dx8c\" (UniqueName: \"kubernetes.io/projected/01ffdc71-90ec-42da-ae77-d65caba67d94-kube-api-access-4dx8c\") on node \"crc\" DevicePath \"\""
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.924820 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:25:44 crc kubenswrapper[4931]: I0130 06:25:44.934443 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:25:45 crc kubenswrapper[4931]: I0130 06:25:45.439505 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ffdc71-90ec-42da-ae77-d65caba67d94" path="/var/lib/kubelet/pods/01ffdc71-90ec-42da-ae77-d65caba67d94/volumes"
Jan 30 06:25:56 crc kubenswrapper[4931]: I0130 06:25:56.917491 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gp9rt"]
Jan 30 06:25:56 crc kubenswrapper[4931]: E0130 06:25:56.918154 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ffdc71-90ec-42da-ae77-d65caba67d94" containerName="mariadb-client"
Jan 30 06:25:56 crc kubenswrapper[4931]: I0130 06:25:56.918166 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ffdc71-90ec-42da-ae77-d65caba67d94" containerName="mariadb-client"
Jan 30 06:25:56 crc kubenswrapper[4931]: I0130 06:25:56.918308 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ffdc71-90ec-42da-ae77-d65caba67d94" containerName="mariadb-client"
Jan 30 06:25:56 crc kubenswrapper[4931]: I0130 06:25:56.919254 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:56 crc kubenswrapper[4931]: I0130 06:25:56.941262 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp9rt"]
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.104626 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-catalog-content\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.104712 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvw9m\" (UniqueName: \"kubernetes.io/projected/a8685ea7-9223-467f-aa16-c300e37458a6-kube-api-access-xvw9m\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.104914 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-utilities\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.206534 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-catalog-content\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.206638 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvw9m\" (UniqueName: \"kubernetes.io/projected/a8685ea7-9223-467f-aa16-c300e37458a6-kube-api-access-xvw9m\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.206729 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-utilities\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.207098 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-catalog-content\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.207213 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-utilities\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.248851 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvw9m\" (UniqueName: \"kubernetes.io/projected/a8685ea7-9223-467f-aa16-c300e37458a6-kube-api-access-xvw9m\") pod \"certified-operators-gp9rt\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") " pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.543407 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:25:57 crc kubenswrapper[4931]: I0130 06:25:57.807590 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp9rt"]
Jan 30 06:25:58 crc kubenswrapper[4931]: I0130 06:25:58.727887 4931 generic.go:334] "Generic (PLEG): container finished" podID="a8685ea7-9223-467f-aa16-c300e37458a6" containerID="b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9" exitCode=0
Jan 30 06:25:58 crc kubenswrapper[4931]: I0130 06:25:58.727955 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9rt" event={"ID":"a8685ea7-9223-467f-aa16-c300e37458a6","Type":"ContainerDied","Data":"b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9"}
Jan 30 06:25:58 crc kubenswrapper[4931]: I0130 06:25:58.727999 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9rt" event={"ID":"a8685ea7-9223-467f-aa16-c300e37458a6","Type":"ContainerStarted","Data":"146130ba3b835b68dc312b2d6298eca310c31f68dc9e2518f2d63a78ca8e23b3"}
Jan 30 06:25:59 crc kubenswrapper[4931]: I0130 06:25:59.740217 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9rt" event={"ID":"a8685ea7-9223-467f-aa16-c300e37458a6","Type":"ContainerStarted","Data":"8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46"}
Jan 30 06:26:00 crc kubenswrapper[4931]: I0130 06:26:00.756379 4931 generic.go:334] "Generic (PLEG): container finished" podID="a8685ea7-9223-467f-aa16-c300e37458a6" containerID="8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46" exitCode=0
Jan 30 06:26:00 crc kubenswrapper[4931]: I0130 06:26:00.756478 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9rt" event={"ID":"a8685ea7-9223-467f-aa16-c300e37458a6","Type":"ContainerDied","Data":"8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46"}
Jan 30 06:26:02 crc kubenswrapper[4931]: I0130 06:26:02.786033 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9rt" event={"ID":"a8685ea7-9223-467f-aa16-c300e37458a6","Type":"ContainerStarted","Data":"e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071"}
Jan 30 06:26:02 crc kubenswrapper[4931]: I0130 06:26:02.831141 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gp9rt" podStartSLOduration=4.426415297 podStartE2EDuration="6.831111531s" podCreationTimestamp="2026-01-30 06:25:56 +0000 UTC" firstStartedPulling="2026-01-30 06:25:58.733039564 +0000 UTC m=+4694.102949861" lastFinishedPulling="2026-01-30 06:26:01.137735798 +0000 UTC m=+4696.507646095" observedRunningTime="2026-01-30 06:26:02.823607621 +0000 UTC m=+4698.193517968" watchObservedRunningTime="2026-01-30 06:26:02.831111531 +0000 UTC m=+4698.201021828"
Jan 30 06:26:07 crc kubenswrapper[4931]: I0130 06:26:07.543697 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:26:07 crc kubenswrapper[4931]: I0130 06:26:07.544532 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:26:07 crc kubenswrapper[4931]: I0130 06:26:07.617211 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:26:07 crc kubenswrapper[4931]: I0130 06:26:07.927309 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:26:07 crc kubenswrapper[4931]: I0130 06:26:07.994133 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp9rt"]
Jan 30 06:26:09 crc kubenswrapper[4931]: I0130 06:26:09.851553 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gp9rt" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="registry-server" containerID="cri-o://e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071" gracePeriod=2
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.387530 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.543464 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-catalog-content\") pod \"a8685ea7-9223-467f-aa16-c300e37458a6\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") "
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.543521 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-utilities\") pod \"a8685ea7-9223-467f-aa16-c300e37458a6\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") "
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.543570 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvw9m\" (UniqueName: \"kubernetes.io/projected/a8685ea7-9223-467f-aa16-c300e37458a6-kube-api-access-xvw9m\") pod \"a8685ea7-9223-467f-aa16-c300e37458a6\" (UID: \"a8685ea7-9223-467f-aa16-c300e37458a6\") "
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.546207 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-utilities" (OuterVolumeSpecName: "utilities") pod "a8685ea7-9223-467f-aa16-c300e37458a6" (UID: "a8685ea7-9223-467f-aa16-c300e37458a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.550042 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8685ea7-9223-467f-aa16-c300e37458a6-kube-api-access-xvw9m" (OuterVolumeSpecName: "kube-api-access-xvw9m") pod "a8685ea7-9223-467f-aa16-c300e37458a6" (UID: "a8685ea7-9223-467f-aa16-c300e37458a6"). InnerVolumeSpecName "kube-api-access-xvw9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.645363 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvw9m\" (UniqueName: \"kubernetes.io/projected/a8685ea7-9223-467f-aa16-c300e37458a6-kube-api-access-xvw9m\") on node \"crc\" DevicePath \"\""
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.645395 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.871537 4931 generic.go:334] "Generic (PLEG): container finished" podID="a8685ea7-9223-467f-aa16-c300e37458a6" containerID="e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071" exitCode=0
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.871625 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9rt" event={"ID":"a8685ea7-9223-467f-aa16-c300e37458a6","Type":"ContainerDied","Data":"e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071"}
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.871639 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp9rt"
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.871692 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp9rt" event={"ID":"a8685ea7-9223-467f-aa16-c300e37458a6","Type":"ContainerDied","Data":"146130ba3b835b68dc312b2d6298eca310c31f68dc9e2518f2d63a78ca8e23b3"}
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.871731 4931 scope.go:117] "RemoveContainer" containerID="e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071"
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.913768 4931 scope.go:117] "RemoveContainer" containerID="8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46"
Jan 30 06:26:10 crc kubenswrapper[4931]: I0130 06:26:10.959992 4931 scope.go:117] "RemoveContainer" containerID="b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9"
Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.031224 4931 scope.go:117] "RemoveContainer" containerID="e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071"
Jan 30 06:26:11 crc kubenswrapper[4931]: E0130 06:26:11.032278 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071\": container with ID starting with e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071 not found: ID does not exist" containerID="e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071"
Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.032343 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071"} err="failed to get container status \"e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071\": rpc error: code = NotFound desc = could not find container \"e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071\": container with ID starting with e0ec2dd37b22b9e1e57abf9e5cace96a2831aa2924c90e60c36810dfd0a9f071 not found: ID does not exist"
Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.032383 4931 scope.go:117] "RemoveContainer" containerID="8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46"
Jan 30 06:26:11 crc kubenswrapper[4931]: E0130 06:26:11.032912 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46\": container with ID starting with 8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46 not found: ID does not exist" containerID="8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46"
Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.032958 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46"} err="failed to get container status \"8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46\": rpc error: code = NotFound desc = could not find container \"8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46\": container with ID starting with 8151a207868259506fa8d97beb80eae49c90d31fc1799495821d0c42c53b9c46 not found: ID does not exist"
Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.032985 4931 scope.go:117] "RemoveContainer" containerID="b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9"
Jan 30 06:26:11 crc kubenswrapper[4931]: E0130 06:26:11.033396 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9\": container with ID starting with b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9 not found: ID does not exist"
containerID="b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9" Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.033459 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9"} err="failed to get container status \"b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9\": rpc error: code = NotFound desc = could not find container \"b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9\": container with ID starting with b4c888e2b470037ed9cd1997c3b6ae2d408007800ca0928273fd3046d6d08cb9 not found: ID does not exist" Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.078332 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8685ea7-9223-467f-aa16-c300e37458a6" (UID: "a8685ea7-9223-467f-aa16-c300e37458a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.154176 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8685ea7-9223-467f-aa16-c300e37458a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.221133 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp9rt"] Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.228356 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gp9rt"] Jan 30 06:26:11 crc kubenswrapper[4931]: I0130 06:26:11.440055 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" path="/var/lib/kubelet/pods/a8685ea7-9223-467f-aa16-c300e37458a6/volumes" Jan 30 06:26:27 crc kubenswrapper[4931]: I0130 06:26:27.363641 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:26:27 crc kubenswrapper[4931]: I0130 06:26:27.364326 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:26:53 crc kubenswrapper[4931]: I0130 06:26:53.678197 4931 scope.go:117] "RemoveContainer" containerID="7d286e4ff9e3a5d29e83c4a7e4320e5360dd3ec6c72cd95a6b0fdf400bac7103" Jan 30 06:26:57 crc kubenswrapper[4931]: I0130 06:26:57.363632 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:26:57 crc kubenswrapper[4931]: I0130 06:26:57.364111 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.363944 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.365560 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.365623 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.366253 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c01137c96244e8746673f3822245d96905f563a3e4aa39b2a8d7db22e60ff5b"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.366308 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://2c01137c96244e8746673f3822245d96905f563a3e4aa39b2a8d7db22e60ff5b" gracePeriod=600 Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.633067 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="2c01137c96244e8746673f3822245d96905f563a3e4aa39b2a8d7db22e60ff5b" exitCode=0 Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.633179 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"2c01137c96244e8746673f3822245d96905f563a3e4aa39b2a8d7db22e60ff5b"} Jan 30 06:27:27 crc kubenswrapper[4931]: I0130 06:27:27.633578 4931 scope.go:117] "RemoveContainer" containerID="77d9af72e686de38fde2b8554c979e8b7ae6fb2e13737ff4434c70e44692f229" Jan 30 06:27:28 crc kubenswrapper[4931]: I0130 06:27:28.647249 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"} Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.031901 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vq4hz"] Jan 30 06:27:55 crc kubenswrapper[4931]: E0130 06:27:55.032939 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="extract-utilities" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.032961 4931 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="extract-utilities" Jan 30 06:27:55 crc kubenswrapper[4931]: E0130 06:27:55.032980 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="extract-content" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.032992 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="extract-content" Jan 30 06:27:55 crc kubenswrapper[4931]: E0130 06:27:55.033035 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="registry-server" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.033048 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="registry-server" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.033304 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8685ea7-9223-467f-aa16-c300e37458a6" containerName="registry-server" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.035181 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.079770 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vq4hz"] Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.109415 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-catalog-content\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.109906 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-utilities\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.110009 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvc8g\" (UniqueName: \"kubernetes.io/projected/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-kube-api-access-pvc8g\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.211629 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-utilities\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.211679 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pvc8g\" (UniqueName: \"kubernetes.io/projected/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-kube-api-access-pvc8g\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.211799 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-catalog-content\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.212326 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-catalog-content\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.212634 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-utilities\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.304876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvc8g\" (UniqueName: \"kubernetes.io/projected/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-kube-api-access-pvc8g\") pod \"community-operators-vq4hz\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.385480 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:27:55 crc kubenswrapper[4931]: I0130 06:27:55.932294 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vq4hz"] Jan 30 06:27:56 crc kubenswrapper[4931]: I0130 06:27:56.927177 4931 generic.go:334] "Generic (PLEG): container finished" podID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerID="7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6" exitCode=0 Jan 30 06:27:56 crc kubenswrapper[4931]: I0130 06:27:56.927299 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vq4hz" event={"ID":"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b","Type":"ContainerDied","Data":"7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6"} Jan 30 06:27:56 crc kubenswrapper[4931]: I0130 06:27:56.927593 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vq4hz" event={"ID":"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b","Type":"ContainerStarted","Data":"10231b9edebacb1de68f829de0d15b844091430d6bd68dead34457355f358e40"} Jan 30 06:27:58 crc kubenswrapper[4931]: I0130 06:27:58.947101 4931 generic.go:334] "Generic (PLEG): container finished" podID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerID="c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f" exitCode=0 Jan 30 06:27:58 crc kubenswrapper[4931]: I0130 06:27:58.947212 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vq4hz" event={"ID":"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b","Type":"ContainerDied","Data":"c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f"} Jan 30 06:27:59 crc kubenswrapper[4931]: I0130 06:27:59.970830 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vq4hz" 
event={"ID":"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b","Type":"ContainerStarted","Data":"3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf"} Jan 30 06:28:01 crc kubenswrapper[4931]: I0130 06:28:01.001172 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vq4hz" podStartSLOduration=4.21075302 podStartE2EDuration="7.001154481s" podCreationTimestamp="2026-01-30 06:27:54 +0000 UTC" firstStartedPulling="2026-01-30 06:27:56.932402453 +0000 UTC m=+4812.302312740" lastFinishedPulling="2026-01-30 06:27:59.722803914 +0000 UTC m=+4815.092714201" observedRunningTime="2026-01-30 06:28:00.994014852 +0000 UTC m=+4816.363925129" watchObservedRunningTime="2026-01-30 06:28:01.001154481 +0000 UTC m=+4816.371064758" Jan 30 06:28:05 crc kubenswrapper[4931]: I0130 06:28:05.386214 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:28:05 crc kubenswrapper[4931]: I0130 06:28:05.387130 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:28:05 crc kubenswrapper[4931]: I0130 06:28:05.447375 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:28:06 crc kubenswrapper[4931]: I0130 06:28:06.092530 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:28:06 crc kubenswrapper[4931]: I0130 06:28:06.165634 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vq4hz"] Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.039008 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vq4hz" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerName="registry-server" 
containerID="cri-o://3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf" gracePeriod=2 Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.785623 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.804872 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-catalog-content\") pod \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.804961 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvc8g\" (UniqueName: \"kubernetes.io/projected/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-kube-api-access-pvc8g\") pod \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.805112 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-utilities\") pod \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\" (UID: \"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b\") " Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.806035 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-utilities" (OuterVolumeSpecName: "utilities") pod "c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" (UID: "c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.814233 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-kube-api-access-pvc8g" (OuterVolumeSpecName: "kube-api-access-pvc8g") pod "c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" (UID: "c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b"). InnerVolumeSpecName "kube-api-access-pvc8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.907899 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvc8g\" (UniqueName: \"kubernetes.io/projected/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-kube-api-access-pvc8g\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:08 crc kubenswrapper[4931]: I0130 06:28:08.907953 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.052630 4931 generic.go:334] "Generic (PLEG): container finished" podID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerID="3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf" exitCode=0 Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.052708 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vq4hz" event={"ID":"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b","Type":"ContainerDied","Data":"3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf"} Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.052748 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vq4hz" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.052789 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vq4hz" event={"ID":"c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b","Type":"ContainerDied","Data":"10231b9edebacb1de68f829de0d15b844091430d6bd68dead34457355f358e40"} Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.052822 4931 scope.go:117] "RemoveContainer" containerID="3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.088201 4931 scope.go:117] "RemoveContainer" containerID="c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.120777 4931 scope.go:117] "RemoveContainer" containerID="7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.164452 4931 scope.go:117] "RemoveContainer" containerID="3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf" Jan 30 06:28:09 crc kubenswrapper[4931]: E0130 06:28:09.165013 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf\": container with ID starting with 3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf not found: ID does not exist" containerID="3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.165076 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf"} err="failed to get container status \"3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf\": rpc error: code = NotFound desc = could not find container 
\"3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf\": container with ID starting with 3649ec64523234caeca32f0820ba131802cf5e688cbcb0f82056acb9e312cfaf not found: ID does not exist" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.165115 4931 scope.go:117] "RemoveContainer" containerID="c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f" Jan 30 06:28:09 crc kubenswrapper[4931]: E0130 06:28:09.165732 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f\": container with ID starting with c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f not found: ID does not exist" containerID="c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.165818 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f"} err="failed to get container status \"c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f\": rpc error: code = NotFound desc = could not find container \"c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f\": container with ID starting with c223b5a5d94837a49b52c559a51252c5dfa68877ed7384b6c7103d28fc93de4f not found: ID does not exist" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.165871 4931 scope.go:117] "RemoveContainer" containerID="7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6" Jan 30 06:28:09 crc kubenswrapper[4931]: E0130 06:28:09.166525 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6\": container with ID starting with 7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6 not found: ID does not exist" 
containerID="7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.166606 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6"} err="failed to get container status \"7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6\": rpc error: code = NotFound desc = could not find container \"7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6\": container with ID starting with 7c08337cd5832dab4df7a7ecf0ffe08370dc38255d2a6ae0680fbc33f40952d6 not found: ID does not exist" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.233243 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" (UID: "c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.316191 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.400528 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vq4hz"] Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.411400 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vq4hz"] Jan 30 06:28:09 crc kubenswrapper[4931]: I0130 06:28:09.432569 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" path="/var/lib/kubelet/pods/c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b/volumes" Jan 30 06:29:27 crc kubenswrapper[4931]: I0130 06:29:27.363616 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:29:27 crc kubenswrapper[4931]: I0130 06:29:27.364532 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.678595 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Jan 30 06:29:50 crc kubenswrapper[4931]: E0130 06:29:50.680100 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" 
containerName="extract-utilities" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.680131 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerName="extract-utilities" Jan 30 06:29:50 crc kubenswrapper[4931]: E0130 06:29:50.680180 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerName="extract-content" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.680198 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerName="extract-content" Jan 30 06:29:50 crc kubenswrapper[4931]: E0130 06:29:50.680227 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerName="registry-server" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.680245 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerName="registry-server" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.680620 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a8277a-ce11-4fbf-ab16-7f95c7b2a97b" containerName="registry-server" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.681732 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.685976 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qqbkw" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.692351 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.792942 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkrs2\" (UniqueName: \"kubernetes.io/projected/371cff3f-3d31-4dc6-98eb-b03f2d967337-kube-api-access-kkrs2\") pod \"mariadb-copy-data\" (UID: \"371cff3f-3d31-4dc6-98eb-b03f2d967337\") " pod="openstack/mariadb-copy-data" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.793078 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\") pod \"mariadb-copy-data\" (UID: \"371cff3f-3d31-4dc6-98eb-b03f2d967337\") " pod="openstack/mariadb-copy-data" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.895136 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkrs2\" (UniqueName: \"kubernetes.io/projected/371cff3f-3d31-4dc6-98eb-b03f2d967337-kube-api-access-kkrs2\") pod \"mariadb-copy-data\" (UID: \"371cff3f-3d31-4dc6-98eb-b03f2d967337\") " pod="openstack/mariadb-copy-data" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.895515 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\") pod \"mariadb-copy-data\" (UID: \"371cff3f-3d31-4dc6-98eb-b03f2d967337\") " pod="openstack/mariadb-copy-data" 
Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.902396 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.902656 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\") pod \"mariadb-copy-data\" (UID: \"371cff3f-3d31-4dc6-98eb-b03f2d967337\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/288f6df6449974b404cf65913d6950f1694034a208c1d00e9450880132f599b0/globalmount\"" pod="openstack/mariadb-copy-data" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.923669 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkrs2\" (UniqueName: \"kubernetes.io/projected/371cff3f-3d31-4dc6-98eb-b03f2d967337-kube-api-access-kkrs2\") pod \"mariadb-copy-data\" (UID: \"371cff3f-3d31-4dc6-98eb-b03f2d967337\") " pod="openstack/mariadb-copy-data" Jan 30 06:29:50 crc kubenswrapper[4931]: I0130 06:29:50.942088 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9322c3d3-7eeb-4461-b5eb-df57a66b6935\") pod \"mariadb-copy-data\" (UID: \"371cff3f-3d31-4dc6-98eb-b03f2d967337\") " pod="openstack/mariadb-copy-data" Jan 30 06:29:51 crc kubenswrapper[4931]: I0130 06:29:51.008315 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Jan 30 06:29:51 crc kubenswrapper[4931]: I0130 06:29:51.612061 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Jan 30 06:29:51 crc kubenswrapper[4931]: W0130 06:29:51.811907 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod371cff3f_3d31_4dc6_98eb_b03f2d967337.slice/crio-408220d8d1998f399ef671bd3376facd5e3c82722ed4e84405b357afb42ccc95 WatchSource:0}: Error finding container 408220d8d1998f399ef671bd3376facd5e3c82722ed4e84405b357afb42ccc95: Status 404 returned error can't find the container with id 408220d8d1998f399ef671bd3376facd5e3c82722ed4e84405b357afb42ccc95 Jan 30 06:29:52 crc kubenswrapper[4931]: I0130 06:29:52.211053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"371cff3f-3d31-4dc6-98eb-b03f2d967337","Type":"ContainerStarted","Data":"f1ba84c68713736ab6df303984300eb188f441a6bdf8bef551520210f094feca"} Jan 30 06:29:52 crc kubenswrapper[4931]: I0130 06:29:52.211409 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"371cff3f-3d31-4dc6-98eb-b03f2d967337","Type":"ContainerStarted","Data":"408220d8d1998f399ef671bd3376facd5e3c82722ed4e84405b357afb42ccc95"} Jan 30 06:29:52 crc kubenswrapper[4931]: I0130 06:29:52.230311 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.230290961 podStartE2EDuration="3.230290961s" podCreationTimestamp="2026-01-30 06:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:29:52.229184639 +0000 UTC m=+4927.599094906" watchObservedRunningTime="2026-01-30 06:29:52.230290961 +0000 UTC m=+4927.600201228" Jan 30 06:29:53 crc kubenswrapper[4931]: I0130 06:29:53.843585 4931 
scope.go:117] "RemoveContainer" containerID="71a8124b599814d410f6d79c3260191d88bc5b88a1405b9bfb832aebcb013dc4" Jan 30 06:29:55 crc kubenswrapper[4931]: I0130 06:29:55.224174 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:55 crc kubenswrapper[4931]: I0130 06:29:55.226186 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:29:55 crc kubenswrapper[4931]: I0130 06:29:55.238916 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:55 crc kubenswrapper[4931]: I0130 06:29:55.370501 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qtfm\" (UniqueName: \"kubernetes.io/projected/78316e93-d485-4836-824e-42cbe23eb625-kube-api-access-7qtfm\") pod \"mariadb-client\" (UID: \"78316e93-d485-4836-824e-42cbe23eb625\") " pod="openstack/mariadb-client" Jan 30 06:29:55 crc kubenswrapper[4931]: I0130 06:29:55.472056 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qtfm\" (UniqueName: \"kubernetes.io/projected/78316e93-d485-4836-824e-42cbe23eb625-kube-api-access-7qtfm\") pod \"mariadb-client\" (UID: \"78316e93-d485-4836-824e-42cbe23eb625\") " pod="openstack/mariadb-client" Jan 30 06:29:55 crc kubenswrapper[4931]: I0130 06:29:55.505574 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qtfm\" (UniqueName: \"kubernetes.io/projected/78316e93-d485-4836-824e-42cbe23eb625-kube-api-access-7qtfm\") pod \"mariadb-client\" (UID: \"78316e93-d485-4836-824e-42cbe23eb625\") " pod="openstack/mariadb-client" Jan 30 06:29:55 crc kubenswrapper[4931]: I0130 06:29:55.587936 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:29:56 crc kubenswrapper[4931]: I0130 06:29:56.091941 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:56 crc kubenswrapper[4931]: W0130 06:29:56.096203 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78316e93_d485_4836_824e_42cbe23eb625.slice/crio-2ce03d4660d6878d1ae82ff00b48e5693133d8305792a3c56103245b882c80a4 WatchSource:0}: Error finding container 2ce03d4660d6878d1ae82ff00b48e5693133d8305792a3c56103245b882c80a4: Status 404 returned error can't find the container with id 2ce03d4660d6878d1ae82ff00b48e5693133d8305792a3c56103245b882c80a4 Jan 30 06:29:56 crc kubenswrapper[4931]: I0130 06:29:56.294035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"78316e93-d485-4836-824e-42cbe23eb625","Type":"ContainerStarted","Data":"2ce03d4660d6878d1ae82ff00b48e5693133d8305792a3c56103245b882c80a4"} Jan 30 06:29:57 crc kubenswrapper[4931]: I0130 06:29:57.305658 4931 generic.go:334] "Generic (PLEG): container finished" podID="78316e93-d485-4836-824e-42cbe23eb625" containerID="c20d2d48ca6794144eddad5037d464e6a9ffdad2028bd7ef00590c377c6183ff" exitCode=0 Jan 30 06:29:57 crc kubenswrapper[4931]: I0130 06:29:57.305899 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"78316e93-d485-4836-824e-42cbe23eb625","Type":"ContainerDied","Data":"c20d2d48ca6794144eddad5037d464e6a9ffdad2028bd7ef00590c377c6183ff"} Jan 30 06:29:57 crc kubenswrapper[4931]: I0130 06:29:57.362963 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:29:57 crc kubenswrapper[4931]: 
I0130 06:29:57.363028 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.679105 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.705842 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_78316e93-d485-4836-824e-42cbe23eb625/mariadb-client/0.log" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.735090 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.739937 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.823457 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qtfm\" (UniqueName: \"kubernetes.io/projected/78316e93-d485-4836-824e-42cbe23eb625-kube-api-access-7qtfm\") pod \"78316e93-d485-4836-824e-42cbe23eb625\" (UID: \"78316e93-d485-4836-824e-42cbe23eb625\") " Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.828658 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78316e93-d485-4836-824e-42cbe23eb625-kube-api-access-7qtfm" (OuterVolumeSpecName: "kube-api-access-7qtfm") pod "78316e93-d485-4836-824e-42cbe23eb625" (UID: "78316e93-d485-4836-824e-42cbe23eb625"). InnerVolumeSpecName "kube-api-access-7qtfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.913288 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:59 crc kubenswrapper[4931]: E0130 06:29:58.913635 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78316e93-d485-4836-824e-42cbe23eb625" containerName="mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.913649 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="78316e93-d485-4836-824e-42cbe23eb625" containerName="mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.913788 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="78316e93-d485-4836-824e-42cbe23eb625" containerName="mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.914243 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.918950 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:58.925020 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qtfm\" (UniqueName: \"kubernetes.io/projected/78316e93-d485-4836-824e-42cbe23eb625-kube-api-access-7qtfm\") on node \"crc\" DevicePath \"\"" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.025814 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvt4\" (UniqueName: \"kubernetes.io/projected/e7624857-270e-497b-b3b1-51df662ce3dc-kube-api-access-vlvt4\") pod \"mariadb-client\" (UID: \"e7624857-270e-497b-b3b1-51df662ce3dc\") " pod="openstack/mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.127016 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlvt4\" (UniqueName: 
\"kubernetes.io/projected/e7624857-270e-497b-b3b1-51df662ce3dc-kube-api-access-vlvt4\") pod \"mariadb-client\" (UID: \"e7624857-270e-497b-b3b1-51df662ce3dc\") " pod="openstack/mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.148003 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlvt4\" (UniqueName: \"kubernetes.io/projected/e7624857-270e-497b-b3b1-51df662ce3dc-kube-api-access-vlvt4\") pod \"mariadb-client\" (UID: \"e7624857-270e-497b-b3b1-51df662ce3dc\") " pod="openstack/mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.233311 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.329396 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce03d4660d6878d1ae82ff00b48e5693133d8305792a3c56103245b882c80a4" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.329746 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.357027 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="78316e93-d485-4836-824e-42cbe23eb625" podUID="e7624857-270e-497b-b3b1-51df662ce3dc" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.437823 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78316e93-d485-4836-824e-42cbe23eb625" path="/var/lib/kubelet/pods/78316e93-d485-4836-824e-42cbe23eb625/volumes" Jan 30 06:29:59 crc kubenswrapper[4931]: I0130 06:29:59.681605 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:29:59 crc kubenswrapper[4931]: W0130 06:29:59.684463 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7624857_270e_497b_b3b1_51df662ce3dc.slice/crio-e553feece4667a023db5fed463f69eb82294d8b6104fd7b3fc9694171a2ab0d5 WatchSource:0}: Error finding container e553feece4667a023db5fed463f69eb82294d8b6104fd7b3fc9694171a2ab0d5: Status 404 returned error can't find the container with id e553feece4667a023db5fed463f69eb82294d8b6104fd7b3fc9694171a2ab0d5 Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.147381 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2"] Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.149594 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.155524 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.155944 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.174711 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2"] Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.247149 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e96da373-c61f-4a59-9311-65f140a354a4-secret-volume\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.247297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e96da373-c61f-4a59-9311-65f140a354a4-config-volume\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.247353 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7nnv\" (UniqueName: \"kubernetes.io/projected/e96da373-c61f-4a59-9311-65f140a354a4-kube-api-access-x7nnv\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.341631 4931 generic.go:334] "Generic (PLEG): container finished" podID="e7624857-270e-497b-b3b1-51df662ce3dc" containerID="173f20cd392f59bb3d09e8d879e9a2c54ad0461fcc8850325a690b330805f7aa" exitCode=0 Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.341705 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e7624857-270e-497b-b3b1-51df662ce3dc","Type":"ContainerDied","Data":"173f20cd392f59bb3d09e8d879e9a2c54ad0461fcc8850325a690b330805f7aa"} Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.341753 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e7624857-270e-497b-b3b1-51df662ce3dc","Type":"ContainerStarted","Data":"e553feece4667a023db5fed463f69eb82294d8b6104fd7b3fc9694171a2ab0d5"} Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.349350 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7nnv\" (UniqueName: \"kubernetes.io/projected/e96da373-c61f-4a59-9311-65f140a354a4-kube-api-access-x7nnv\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.349518 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e96da373-c61f-4a59-9311-65f140a354a4-secret-volume\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.349581 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e96da373-c61f-4a59-9311-65f140a354a4-config-volume\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.350680 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e96da373-c61f-4a59-9311-65f140a354a4-config-volume\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.361042 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e96da373-c61f-4a59-9311-65f140a354a4-secret-volume\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.382299 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7nnv\" (UniqueName: \"kubernetes.io/projected/e96da373-c61f-4a59-9311-65f140a354a4-kube-api-access-x7nnv\") pod \"collect-profiles-29495910-ccdq2\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.482414 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:00 crc kubenswrapper[4931]: I0130 06:30:00.970894 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2"] Jan 30 06:30:01 crc kubenswrapper[4931]: I0130 06:30:01.350300 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" event={"ID":"e96da373-c61f-4a59-9311-65f140a354a4","Type":"ContainerStarted","Data":"705db526c932889de3f11f056c165dba633c441108b3e93fffe75f611e076e31"} Jan 30 06:30:01 crc kubenswrapper[4931]: I0130 06:30:01.350363 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" event={"ID":"e96da373-c61f-4a59-9311-65f140a354a4","Type":"ContainerStarted","Data":"29394f67334e83e67eda76d071643e6a82eb5e42f9506be9f8b3cba2f0463934"} Jan 30 06:30:01 crc kubenswrapper[4931]: I0130 06:30:01.372820 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" podStartSLOduration=1.372798006 podStartE2EDuration="1.372798006s" podCreationTimestamp="2026-01-30 06:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:30:01.370305264 +0000 UTC m=+4936.740215531" watchObservedRunningTime="2026-01-30 06:30:01.372798006 +0000 UTC m=+4936.742708263" Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.060010 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.083412 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_e7624857-270e-497b-b3b1-51df662ce3dc/mariadb-client/0.log" Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.115052 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.122286 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.179151 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlvt4\" (UniqueName: \"kubernetes.io/projected/e7624857-270e-497b-b3b1-51df662ce3dc-kube-api-access-vlvt4\") pod \"e7624857-270e-497b-b3b1-51df662ce3dc\" (UID: \"e7624857-270e-497b-b3b1-51df662ce3dc\") " Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.185701 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7624857-270e-497b-b3b1-51df662ce3dc-kube-api-access-vlvt4" (OuterVolumeSpecName: "kube-api-access-vlvt4") pod "e7624857-270e-497b-b3b1-51df662ce3dc" (UID: "e7624857-270e-497b-b3b1-51df662ce3dc"). InnerVolumeSpecName "kube-api-access-vlvt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.281657 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlvt4\" (UniqueName: \"kubernetes.io/projected/e7624857-270e-497b-b3b1-51df662ce3dc-kube-api-access-vlvt4\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.360189 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e553feece4667a023db5fed463f69eb82294d8b6104fd7b3fc9694171a2ab0d5" Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.360296 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.363041 4931 generic.go:334] "Generic (PLEG): container finished" podID="e96da373-c61f-4a59-9311-65f140a354a4" containerID="705db526c932889de3f11f056c165dba633c441108b3e93fffe75f611e076e31" exitCode=0 Jan 30 06:30:02 crc kubenswrapper[4931]: I0130 06:30:02.363076 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" event={"ID":"e96da373-c61f-4a59-9311-65f140a354a4","Type":"ContainerDied","Data":"705db526c932889de3f11f056c165dba633c441108b3e93fffe75f611e076e31"} Jan 30 06:30:03 crc kubenswrapper[4931]: I0130 06:30:03.436858 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7624857-270e-497b-b3b1-51df662ce3dc" path="/var/lib/kubelet/pods/e7624857-270e-497b-b3b1-51df662ce3dc/volumes" Jan 30 06:30:03 crc kubenswrapper[4931]: I0130 06:30:03.908055 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.007235 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e96da373-c61f-4a59-9311-65f140a354a4-config-volume\") pod \"e96da373-c61f-4a59-9311-65f140a354a4\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.007694 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e96da373-c61f-4a59-9311-65f140a354a4-secret-volume\") pod \"e96da373-c61f-4a59-9311-65f140a354a4\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.008013 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7nnv\" (UniqueName: \"kubernetes.io/projected/e96da373-c61f-4a59-9311-65f140a354a4-kube-api-access-x7nnv\") pod \"e96da373-c61f-4a59-9311-65f140a354a4\" (UID: \"e96da373-c61f-4a59-9311-65f140a354a4\") " Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.008469 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e96da373-c61f-4a59-9311-65f140a354a4-config-volume" (OuterVolumeSpecName: "config-volume") pod "e96da373-c61f-4a59-9311-65f140a354a4" (UID: "e96da373-c61f-4a59-9311-65f140a354a4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.016072 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96da373-c61f-4a59-9311-65f140a354a4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e96da373-c61f-4a59-9311-65f140a354a4" (UID: "e96da373-c61f-4a59-9311-65f140a354a4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.016685 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96da373-c61f-4a59-9311-65f140a354a4-kube-api-access-x7nnv" (OuterVolumeSpecName: "kube-api-access-x7nnv") pod "e96da373-c61f-4a59-9311-65f140a354a4" (UID: "e96da373-c61f-4a59-9311-65f140a354a4"). InnerVolumeSpecName "kube-api-access-x7nnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.110696 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7nnv\" (UniqueName: \"kubernetes.io/projected/e96da373-c61f-4a59-9311-65f140a354a4-kube-api-access-x7nnv\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.110745 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e96da373-c61f-4a59-9311-65f140a354a4-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.110758 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e96da373-c61f-4a59-9311-65f140a354a4-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.385052 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" event={"ID":"e96da373-c61f-4a59-9311-65f140a354a4","Type":"ContainerDied","Data":"29394f67334e83e67eda76d071643e6a82eb5e42f9506be9f8b3cba2f0463934"} Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.385146 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29394f67334e83e67eda76d071643e6a82eb5e42f9506be9f8b3cba2f0463934" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.385094 4931 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-ccdq2" Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.457792 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"] Jan 30 06:30:04 crc kubenswrapper[4931]: I0130 06:30:04.470807 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-q2t6n"] Jan 30 06:30:05 crc kubenswrapper[4931]: I0130 06:30:05.436548 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ad7b66-28c5-436b-9dc4-86be3d48787b" path="/var/lib/kubelet/pods/71ad7b66-28c5-436b-9dc4-86be3d48787b/volumes" Jan 30 06:30:27 crc kubenswrapper[4931]: I0130 06:30:27.363040 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:30:27 crc kubenswrapper[4931]: I0130 06:30:27.363601 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:30:27 crc kubenswrapper[4931]: I0130 06:30:27.363647 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:30:27 crc kubenswrapper[4931]: I0130 06:30:27.364330 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"} 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:30:27 crc kubenswrapper[4931]: I0130 06:30:27.364386 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" gracePeriod=600 Jan 30 06:30:28 crc kubenswrapper[4931]: E0130 06:30:28.591115 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:30:28 crc kubenswrapper[4931]: I0130 06:30:28.623772 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" exitCode=0 Jan 30 06:30:28 crc kubenswrapper[4931]: I0130 06:30:28.623872 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"} Jan 30 06:30:28 crc kubenswrapper[4931]: I0130 06:30:28.624083 4931 scope.go:117] "RemoveContainer" containerID="2c01137c96244e8746673f3822245d96905f563a3e4aa39b2a8d7db22e60ff5b" Jan 30 06:30:28 crc kubenswrapper[4931]: I0130 06:30:28.624657 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 
30 06:30:28 crc kubenswrapper[4931]: E0130 06:30:28.624935 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.770319 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6b9gc"] Jan 30 06:30:32 crc kubenswrapper[4931]: E0130 06:30:32.771038 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96da373-c61f-4a59-9311-65f140a354a4" containerName="collect-profiles" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.771050 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96da373-c61f-4a59-9311-65f140a354a4" containerName="collect-profiles" Jan 30 06:30:32 crc kubenswrapper[4931]: E0130 06:30:32.771076 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7624857-270e-497b-b3b1-51df662ce3dc" containerName="mariadb-client" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.771082 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7624857-270e-497b-b3b1-51df662ce3dc" containerName="mariadb-client" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.771216 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96da373-c61f-4a59-9311-65f140a354a4" containerName="collect-profiles" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.771226 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7624857-270e-497b-b3b1-51df662ce3dc" containerName="mariadb-client" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.772335 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.776198 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6b9gc"] Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.976332 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-catalog-content\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.976489 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-utilities\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:32 crc kubenswrapper[4931]: I0130 06:30:32.976804 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfcln\" (UniqueName: \"kubernetes.io/projected/99ddeac4-7ac5-423d-8eba-59f5162fa8df-kube-api-access-pfcln\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.078168 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfcln\" (UniqueName: \"kubernetes.io/projected/99ddeac4-7ac5-423d-8eba-59f5162fa8df-kube-api-access-pfcln\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.078243 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-catalog-content\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.078292 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-utilities\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.078735 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-utilities\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.078981 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-catalog-content\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.098943 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfcln\" (UniqueName: \"kubernetes.io/projected/99ddeac4-7ac5-423d-8eba-59f5162fa8df-kube-api-access-pfcln\") pod \"redhat-operators-6b9gc\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.144231 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.577185 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6b9gc"] Jan 30 06:30:33 crc kubenswrapper[4931]: I0130 06:30:33.673284 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b9gc" event={"ID":"99ddeac4-7ac5-423d-8eba-59f5162fa8df","Type":"ContainerStarted","Data":"1de6c21a8cc92620f7e298031965ad6e56e27bd6a60708eec9d6fd5a55666e05"} Jan 30 06:30:34 crc kubenswrapper[4931]: I0130 06:30:34.686126 4931 generic.go:334] "Generic (PLEG): container finished" podID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerID="4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f" exitCode=0 Jan 30 06:30:34 crc kubenswrapper[4931]: I0130 06:30:34.686186 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b9gc" event={"ID":"99ddeac4-7ac5-423d-8eba-59f5162fa8df","Type":"ContainerDied","Data":"4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f"} Jan 30 06:30:34 crc kubenswrapper[4931]: I0130 06:30:34.689735 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:30:36 crc kubenswrapper[4931]: I0130 06:30:36.705955 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b9gc" event={"ID":"99ddeac4-7ac5-423d-8eba-59f5162fa8df","Type":"ContainerStarted","Data":"bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac"} Jan 30 06:30:37 crc kubenswrapper[4931]: I0130 06:30:37.719490 4931 generic.go:334] "Generic (PLEG): container finished" podID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerID="bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac" exitCode=0 Jan 30 06:30:37 crc kubenswrapper[4931]: I0130 06:30:37.719541 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6b9gc" event={"ID":"99ddeac4-7ac5-423d-8eba-59f5162fa8df","Type":"ContainerDied","Data":"bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac"} Jan 30 06:30:38 crc kubenswrapper[4931]: I0130 06:30:38.728508 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b9gc" event={"ID":"99ddeac4-7ac5-423d-8eba-59f5162fa8df","Type":"ContainerStarted","Data":"a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760"} Jan 30 06:30:38 crc kubenswrapper[4931]: I0130 06:30:38.756256 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6b9gc" podStartSLOduration=3.079488651 podStartE2EDuration="6.756226891s" podCreationTimestamp="2026-01-30 06:30:32 +0000 UTC" firstStartedPulling="2026-01-30 06:30:34.689448777 +0000 UTC m=+4970.059359044" lastFinishedPulling="2026-01-30 06:30:38.366187027 +0000 UTC m=+4973.736097284" observedRunningTime="2026-01-30 06:30:38.7499109 +0000 UTC m=+4974.119821187" watchObservedRunningTime="2026-01-30 06:30:38.756226891 +0000 UTC m=+4974.126137168" Jan 30 06:30:43 crc kubenswrapper[4931]: I0130 06:30:43.144652 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:43 crc kubenswrapper[4931]: I0130 06:30:43.144937 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:44 crc kubenswrapper[4931]: I0130 06:30:44.214569 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6b9gc" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="registry-server" probeResult="failure" output=< Jan 30 06:30:44 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 06:30:44 crc kubenswrapper[4931]: > Jan 30 06:30:44 crc kubenswrapper[4931]: I0130 
06:30:44.422216 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:30:44 crc kubenswrapper[4931]: E0130 06:30:44.422452 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:30:53 crc kubenswrapper[4931]: I0130 06:30:53.204749 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:53 crc kubenswrapper[4931]: I0130 06:30:53.279457 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:53 crc kubenswrapper[4931]: I0130 06:30:53.450814 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6b9gc"] Jan 30 06:30:53 crc kubenswrapper[4931]: I0130 06:30:53.914733 4931 scope.go:117] "RemoveContainer" containerID="0262628a4935b4dc10f986d98e7493ff62eab4841805fce6eb8783a9ef5f62e3" Jan 30 06:30:54 crc kubenswrapper[4931]: I0130 06:30:54.891730 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6b9gc" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="registry-server" containerID="cri-o://a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760" gracePeriod=2 Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.759724 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.861363 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-utilities\") pod \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.861562 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfcln\" (UniqueName: \"kubernetes.io/projected/99ddeac4-7ac5-423d-8eba-59f5162fa8df-kube-api-access-pfcln\") pod \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.861632 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-catalog-content\") pod \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\" (UID: \"99ddeac4-7ac5-423d-8eba-59f5162fa8df\") " Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.862797 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-utilities" (OuterVolumeSpecName: "utilities") pod "99ddeac4-7ac5-423d-8eba-59f5162fa8df" (UID: "99ddeac4-7ac5-423d-8eba-59f5162fa8df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.870309 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ddeac4-7ac5-423d-8eba-59f5162fa8df-kube-api-access-pfcln" (OuterVolumeSpecName: "kube-api-access-pfcln") pod "99ddeac4-7ac5-423d-8eba-59f5162fa8df" (UID: "99ddeac4-7ac5-423d-8eba-59f5162fa8df"). InnerVolumeSpecName "kube-api-access-pfcln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.904521 4931 generic.go:334] "Generic (PLEG): container finished" podID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerID="a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760" exitCode=0 Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.904573 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b9gc" event={"ID":"99ddeac4-7ac5-423d-8eba-59f5162fa8df","Type":"ContainerDied","Data":"a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760"} Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.904595 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6b9gc" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.904620 4931 scope.go:117] "RemoveContainer" containerID="a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.904606 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b9gc" event={"ID":"99ddeac4-7ac5-423d-8eba-59f5162fa8df","Type":"ContainerDied","Data":"1de6c21a8cc92620f7e298031965ad6e56e27bd6a60708eec9d6fd5a55666e05"} Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.941469 4931 scope.go:117] "RemoveContainer" containerID="bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.962767 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.962809 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfcln\" (UniqueName: 
\"kubernetes.io/projected/99ddeac4-7ac5-423d-8eba-59f5162fa8df-kube-api-access-pfcln\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:55 crc kubenswrapper[4931]: I0130 06:30:55.972652 4931 scope.go:117] "RemoveContainer" containerID="4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.020385 4931 scope.go:117] "RemoveContainer" containerID="a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760" Jan 30 06:30:56 crc kubenswrapper[4931]: E0130 06:30:56.020925 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760\": container with ID starting with a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760 not found: ID does not exist" containerID="a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.020953 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760"} err="failed to get container status \"a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760\": rpc error: code = NotFound desc = could not find container \"a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760\": container with ID starting with a5cad92a5a23adb264c0785ebc4f46dcc7a51b1aaa9e8165b6fabd6e6aa56760 not found: ID does not exist" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.020973 4931 scope.go:117] "RemoveContainer" containerID="bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac" Jan 30 06:30:56 crc kubenswrapper[4931]: E0130 06:30:56.021343 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac\": container with ID 
starting with bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac not found: ID does not exist" containerID="bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.021371 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac"} err="failed to get container status \"bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac\": rpc error: code = NotFound desc = could not find container \"bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac\": container with ID starting with bf2670308f447f9ad8224e13b2d9d4ca3519f70e61f61c7aad9f77a0366e7eac not found: ID does not exist" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.021386 4931 scope.go:117] "RemoveContainer" containerID="4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f" Jan 30 06:30:56 crc kubenswrapper[4931]: E0130 06:30:56.021735 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f\": container with ID starting with 4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f not found: ID does not exist" containerID="4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.021761 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f"} err="failed to get container status \"4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f\": rpc error: code = NotFound desc = could not find container \"4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f\": container with ID starting with 4356cf57ad96373bd7eaa048120ef15b76fbf1b09320028365330caad7053f7f not found: 
ID does not exist" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.037159 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99ddeac4-7ac5-423d-8eba-59f5162fa8df" (UID: "99ddeac4-7ac5-423d-8eba-59f5162fa8df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.063898 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ddeac4-7ac5-423d-8eba-59f5162fa8df-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.256085 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6b9gc"] Jan 30 06:30:56 crc kubenswrapper[4931]: I0130 06:30:56.261344 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6b9gc"] Jan 30 06:30:57 crc kubenswrapper[4931]: I0130 06:30:57.437036 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" path="/var/lib/kubelet/pods/99ddeac4-7ac5-423d-8eba-59f5162fa8df/volumes" Jan 30 06:30:58 crc kubenswrapper[4931]: I0130 06:30:58.422388 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:30:58 crc kubenswrapper[4931]: E0130 06:30:58.423217 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 
30 06:31:12 crc kubenswrapper[4931]: I0130 06:31:12.421737 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:31:12 crc kubenswrapper[4931]: E0130 06:31:12.422762 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:31:23 crc kubenswrapper[4931]: I0130 06:31:23.421927 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:31:23 crc kubenswrapper[4931]: E0130 06:31:23.422784 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:31:38 crc kubenswrapper[4931]: I0130 06:31:38.422490 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:31:38 crc kubenswrapper[4931]: E0130 06:31:38.425349 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:31:48 crc kubenswrapper[4931]: I0130 06:31:48.107747 4931 trace.go:236] Trace[1878925583]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (30-Jan-2026 06:31:47.081) (total time: 1026ms): Jan 30 06:31:48 crc kubenswrapper[4931]: Trace[1878925583]: [1.026658415s] [1.026658415s] END Jan 30 06:31:49 crc kubenswrapper[4931]: I0130 06:31:49.423987 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:31:49 crc kubenswrapper[4931]: E0130 06:31:49.424553 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:31:56 crc kubenswrapper[4931]: E0130 06:31:56.771555 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.179:44662->38.102.83.179:45103: write tcp 38.102.83.179:44662->38.102.83.179:45103: write: broken pipe Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.108406 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 06:31:59 crc kubenswrapper[4931]: E0130 06:31:59.109112 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="registry-server" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.109130 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="registry-server" Jan 30 06:31:59 crc kubenswrapper[4931]: E0130 06:31:59.109146 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="extract-utilities" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.109154 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="extract-utilities" Jan 30 06:31:59 crc kubenswrapper[4931]: E0130 06:31:59.109174 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="extract-content" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.109184 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="extract-content" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.109372 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ddeac4-7ac5-423d-8eba-59f5162fa8df" containerName="registry-server" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.110292 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.118270 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.118783 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.119177 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5gfqb" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.127400 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.147978 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.160309 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.179238 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.182289 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.215727 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.222058 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.268858 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83787678-4305-4893-8aa4-d1ddd8c15343-config\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.268910 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83787678-4305-4893-8aa4-d1ddd8c15343-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.268942 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.268975 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7l2x\" 
(UniqueName: \"kubernetes.io/projected/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-kube-api-access-f7l2x\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269018 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-config\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269048 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc86399-3f01-4c6a-942d-b255a957dc52-config\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269074 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269176 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83787678-4305-4893-8aa4-d1ddd8c15343-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269222 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269281 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82lrm\" (UniqueName: \"kubernetes.io/projected/83787678-4305-4893-8aa4-d1ddd8c15343-kube-api-access-82lrm\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269358 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269377 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269398 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-clwbf\" (UniqueName: \"kubernetes.io/projected/ffc86399-3f01-4c6a-942d-b255a957dc52-kube-api-access-clwbf\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269431 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffc86399-3f01-4c6a-942d-b255a957dc52-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269464 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffc86399-3f01-4c6a-942d-b255a957dc52-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269489 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc86399-3f01-4c6a-942d-b255a957dc52-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.269506 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83787678-4305-4893-8aa4-d1ddd8c15343-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.283056 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.284482 4931 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.286503 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jvt4p" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.286630 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.287038 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.310525 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.321589 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.323896 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.331980 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.334912 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.337783 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.350163 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.370989 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83787678-4305-4893-8aa4-d1ddd8c15343-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371041 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371067 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gp68\" (UniqueName: \"kubernetes.io/projected/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-kube-api-access-7gp68\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371085 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82lrm\" (UniqueName: \"kubernetes.io/projected/83787678-4305-4893-8aa4-d1ddd8c15343-kube-api-access-82lrm\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371114 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371152 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371171 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371189 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371206 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwbf\" (UniqueName: \"kubernetes.io/projected/ffc86399-3f01-4c6a-942d-b255a957dc52-kube-api-access-clwbf\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371224 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffc86399-3f01-4c6a-942d-b255a957dc52-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371265 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffc86399-3f01-4c6a-942d-b255a957dc52-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371279 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc86399-3f01-4c6a-942d-b255a957dc52-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371292 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83787678-4305-4893-8aa4-d1ddd8c15343-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371317 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371340 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83787678-4305-4893-8aa4-d1ddd8c15343-config\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371357 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83787678-4305-4893-8aa4-d1ddd8c15343-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371374 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371392 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371410 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 
06:31:59.371440 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7l2x\" (UniqueName: \"kubernetes.io/projected/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-kube-api-access-f7l2x\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371470 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-config\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371491 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc86399-3f01-4c6a-942d-b255a957dc52-config\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371507 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371873 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffc86399-3f01-4c6a-942d-b255a957dc52-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.371944 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-ovsdb-rundir\") pod 
\"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.372237 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83787678-4305-4893-8aa4-d1ddd8c15343-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.372912 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-config\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.373562 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffc86399-3f01-4c6a-942d-b255a957dc52-config\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.373846 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83787678-4305-4893-8aa4-d1ddd8c15343-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.374246 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.374545 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83787678-4305-4893-8aa4-d1ddd8c15343-config\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.374606 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffc86399-3f01-4c6a-942d-b255a957dc52-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.379078 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83787678-4305-4893-8aa4-d1ddd8c15343-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.379824 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.379852 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/38ceb5621d454c1692b720cf30f10e3b664762114fdc3e7d5b38f883b904b6d8/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.379898 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.379934 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc86399-3f01-4c6a-942d-b255a957dc52-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.379932 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/542e0ca3a7e896c286d5fb36340e8db4366e8a6a9e4a26986a8767cc153e14e9/globalmount\"" pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.380142 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.380160 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0df4613f538e56a2f714ec1b65ac6bc28b7d7e300b847859a521a3448da838ee/globalmount\"" pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.384905 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.391265 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7l2x\" (UniqueName: \"kubernetes.io/projected/78634c9d-d8d8-4eed-adc7-fe9fdbf69a11-kube-api-access-f7l2x\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.391861 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwbf\" (UniqueName: \"kubernetes.io/projected/ffc86399-3f01-4c6a-942d-b255a957dc52-kube-api-access-clwbf\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.396612 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82lrm\" (UniqueName: \"kubernetes.io/projected/83787678-4305-4893-8aa4-d1ddd8c15343-kube-api-access-82lrm\") pod \"ovsdbserver-sb-2\" (UID: 
\"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.407899 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0e9ceaa-7ade-4f8a-bd83-6f37f450fbc6\") pod \"ovsdbserver-sb-2\" (UID: \"83787678-4305-4893-8aa4-d1ddd8c15343\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.409820 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0aaad663-0830-4310-a7ba-80cfdf78a81e\") pod \"ovsdbserver-sb-0\" (UID: \"ffc86399-3f01-4c6a-942d-b255a957dc52\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.410087 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2318bf44-a8fd-4089-a284-9ff294284d8e\") pod \"ovsdbserver-sb-1\" (UID: \"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.449552 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473062 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473120 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-config\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473160 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473183 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/60983391-7945-4efe-ae6d-7c6ae80e2df8-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473209 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\") pod \"ovsdbserver-nb-1\" (UID: 
\"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473234 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473268 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60983391-7945-4efe-ae6d-7c6ae80e2df8-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473318 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e651c73-1761-4cda-83b7-5a80fa3af6f4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473337 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e651c73-1761-4cda-83b7-5a80fa3af6f4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473362 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/60983391-7945-4efe-ae6d-7c6ae80e2df8-kube-api-access-mn774\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc 
kubenswrapper[4931]: I0130 06:31:59.473435 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gp68\" (UniqueName: \"kubernetes.io/projected/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-kube-api-access-7gp68\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473476 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60983391-7945-4efe-ae6d-7c6ae80e2df8-config\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473508 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xvkx\" (UniqueName: \"kubernetes.io/projected/6e651c73-1761-4cda-83b7-5a80fa3af6f4-kube-api-access-9xvkx\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e651c73-1761-4cda-83b7-5a80fa3af6f4-config\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473567 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473614 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60983391-7945-4efe-ae6d-7c6ae80e2df8-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.473644 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6e651c73-1761-4cda-83b7-5a80fa3af6f4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.474270 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-config\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.474354 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.475089 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " 
pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.476527 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.476564 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1fb14bde885f1ccb5a7f001f4547bab047e87e1e6bfcc6b23b1fd9ec2ced19f1/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.477972 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.486995 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.492801 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gp68\" (UniqueName: \"kubernetes.io/projected/a05ee1a7-c012-4766-8d48-3b508d4f8cd2-kube-api-access-7gp68\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.509568 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-779531dd-d96b-4d2a-bdb2-66dcf36f819e\") pod \"ovsdbserver-nb-0\" (UID: \"a05ee1a7-c012-4766-8d48-3b508d4f8cd2\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.523797 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575001 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60983391-7945-4efe-ae6d-7c6ae80e2df8-config\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575382 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xvkx\" (UniqueName: \"kubernetes.io/projected/6e651c73-1761-4cda-83b7-5a80fa3af6f4-kube-api-access-9xvkx\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575433 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e651c73-1761-4cda-83b7-5a80fa3af6f4-config\") pod \"ovsdbserver-nb-1\" (UID: 
\"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575482 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60983391-7945-4efe-ae6d-7c6ae80e2df8-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575515 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6e651c73-1761-4cda-83b7-5a80fa3af6f4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575554 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575594 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/60983391-7945-4efe-ae6d-7c6ae80e2df8-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575620 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 
crc kubenswrapper[4931]: I0130 06:31:59.575655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60983391-7945-4efe-ae6d-7c6ae80e2df8-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575701 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e651c73-1761-4cda-83b7-5a80fa3af6f4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575721 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e651c73-1761-4cda-83b7-5a80fa3af6f4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.575779 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/60983391-7945-4efe-ae6d-7c6ae80e2df8-kube-api-access-mn774\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.576219 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/60983391-7945-4efe-ae6d-7c6ae80e2df8-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.576307 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6e651c73-1761-4cda-83b7-5a80fa3af6f4-config\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.576636 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6e651c73-1761-4cda-83b7-5a80fa3af6f4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.577179 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60983391-7945-4efe-ae6d-7c6ae80e2df8-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.578794 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60983391-7945-4efe-ae6d-7c6ae80e2df8-config\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.579069 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.579109 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d65f24d1bc535aa5e17a6778d651148a816e7ea430f8275e0cc96cf15566781c/globalmount\"" pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.579277 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.579310 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/190f236266ca958f7a75057eeaca477769ef49c449509c364a9625d5cefac56c/globalmount\"" pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.586113 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e651c73-1761-4cda-83b7-5a80fa3af6f4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.586303 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60983391-7945-4efe-ae6d-7c6ae80e2df8-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: 
\"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.591508 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e651c73-1761-4cda-83b7-5a80fa3af6f4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.598782 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xvkx\" (UniqueName: \"kubernetes.io/projected/6e651c73-1761-4cda-83b7-5a80fa3af6f4-kube-api-access-9xvkx\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.610758 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/60983391-7945-4efe-ae6d-7c6ae80e2df8-kube-api-access-mn774\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.613107 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c5ea30-aac7-470e-8fac-d4800be43434\") pod \"ovsdbserver-nb-2\" (UID: \"60983391-7945-4efe-ae6d-7c6ae80e2df8\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.616652 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a42d7d3-d189-4f42-920b-cb33dd5ab83b\") pod \"ovsdbserver-nb-1\" (UID: \"6e651c73-1761-4cda-83b7-5a80fa3af6f4\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:31:59 crc 
kubenswrapper[4931]: I0130 06:31:59.624598 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.658761 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 30 06:31:59 crc kubenswrapper[4931]: I0130 06:31:59.665720 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.019258 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.129766 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.215586 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.422465 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:32:00 crc kubenswrapper[4931]: E0130 06:32:00.422952 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.488018 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11","Type":"ContainerStarted","Data":"f8fe52323f9c565992ae49c5761a423a277d792d28745a9d5c213776cbc6f203"} Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.489538 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ffc86399-3f01-4c6a-942d-b255a957dc52","Type":"ContainerStarted","Data":"635b99e9ac33ac446dad4dec7dab360cedda8cf2f3dd5235c3061e01fc59154c"} Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.492168 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"83787678-4305-4893-8aa4-d1ddd8c15343","Type":"ContainerStarted","Data":"55cf12283c75a6bfec709c842346840b484deeea772453851bdf7ddca790cb95"} Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.836314 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 30 06:32:00 crc kubenswrapper[4931]: W0130 06:32:00.844252 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60983391_7945_4efe_ae6d_7c6ae80e2df8.slice/crio-0ef299b8cff4ed4f3c0da7b0f9cf13517fb2cb74d4316f5878117a686e945840 WatchSource:0}: Error finding container 0ef299b8cff4ed4f3c0da7b0f9cf13517fb2cb74d4316f5878117a686e945840: Status 404 returned error can't find the container with id 0ef299b8cff4ed4f3c0da7b0f9cf13517fb2cb74d4316f5878117a686e945840 Jan 30 06:32:00 crc kubenswrapper[4931]: I0130 06:32:00.983558 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 30 06:32:00 crc kubenswrapper[4931]: W0130 06:32:00.992393 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e651c73_1761_4cda_83b7_5a80fa3af6f4.slice/crio-1496e030a874e43133b8be8391443ccb9a7aa903acecd2b41639c0b1993fe1f9 WatchSource:0}: Error finding container 1496e030a874e43133b8be8391443ccb9a7aa903acecd2b41639c0b1993fe1f9: Status 404 returned error can't find the container with id 1496e030a874e43133b8be8391443ccb9a7aa903acecd2b41639c0b1993fe1f9 Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.299630 4931 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 06:32:01 crc kubenswrapper[4931]: W0130 06:32:01.299925 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda05ee1a7_c012_4766_8d48_3b508d4f8cd2.slice/crio-d1346e33e80a02288e81934c023d42a1fb25770c363f448e46a776f8431d5f87 WatchSource:0}: Error finding container d1346e33e80a02288e81934c023d42a1fb25770c363f448e46a776f8431d5f87: Status 404 returned error can't find the container with id d1346e33e80a02288e81934c023d42a1fb25770c363f448e46a776f8431d5f87 Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.502998 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"60983391-7945-4efe-ae6d-7c6ae80e2df8","Type":"ContainerStarted","Data":"01c71b8f3a33ee73298e57d599111b20b88828a20aa1da43a4b20be7c4387d06"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.503051 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"60983391-7945-4efe-ae6d-7c6ae80e2df8","Type":"ContainerStarted","Data":"31caf1041cf37cf962e9d2b83850f5f354cc0afb4bdf81669ba1bbd50b0bbe78"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.503066 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"60983391-7945-4efe-ae6d-7c6ae80e2df8","Type":"ContainerStarted","Data":"0ef299b8cff4ed4f3c0da7b0f9cf13517fb2cb74d4316f5878117a686e945840"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.504341 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11","Type":"ContainerStarted","Data":"4f12060011670fe522a97b40f56bf9d2a90759d74f09b317785487900e74d7db"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.504392 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" 
event={"ID":"78634c9d-d8d8-4eed-adc7-fe9fdbf69a11","Type":"ContainerStarted","Data":"c871d6390c1e6a86d07486d70659af4126b81190649deba63824081281713824"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.507856 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ffc86399-3f01-4c6a-942d-b255a957dc52","Type":"ContainerStarted","Data":"26e82e9bee92974c55056f63e2d4f94a18bf881e9e5653b9428fdf0212c0ed99"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.507893 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ffc86399-3f01-4c6a-942d-b255a957dc52","Type":"ContainerStarted","Data":"a8ed37e9fdc67d5e6115c2e9955c4ee9a40a50e46f2229729d527f86cdff4778"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.509672 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"83787678-4305-4893-8aa4-d1ddd8c15343","Type":"ContainerStarted","Data":"b6167738d8312c199a5c0c4f4ee2b22c6d683af17022db00425e1ce30b4f2501"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.509704 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"83787678-4305-4893-8aa4-d1ddd8c15343","Type":"ContainerStarted","Data":"4d224c8fcd758cd92a7aef687ec318745ec7670c5feb656d3acb2ff97fc3fb87"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.511185 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a05ee1a7-c012-4766-8d48-3b508d4f8cd2","Type":"ContainerStarted","Data":"d1346e33e80a02288e81934c023d42a1fb25770c363f448e46a776f8431d5f87"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.512678 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"6e651c73-1761-4cda-83b7-5a80fa3af6f4","Type":"ContainerStarted","Data":"bb9e29a43f2b8697926e39bee430d1df985a5bf7c849bb28641863b0ea609a4a"} Jan 30 06:32:01 crc 
kubenswrapper[4931]: I0130 06:32:01.512700 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"6e651c73-1761-4cda-83b7-5a80fa3af6f4","Type":"ContainerStarted","Data":"90da14e6adc2c79bf540a8391491eb7642d8f94543af62652a399ece12ea1dce"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.512710 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"6e651c73-1761-4cda-83b7-5a80fa3af6f4","Type":"ContainerStarted","Data":"1496e030a874e43133b8be8391443ccb9a7aa903acecd2b41639c0b1993fe1f9"} Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.524138 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.524116073 podStartE2EDuration="3.524116073s" podCreationTimestamp="2026-01-30 06:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:01.521371444 +0000 UTC m=+5056.891281731" watchObservedRunningTime="2026-01-30 06:32:01.524116073 +0000 UTC m=+5056.894026330" Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.557342 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.557322706 podStartE2EDuration="3.557322706s" podCreationTimestamp="2026-01-30 06:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:01.545155677 +0000 UTC m=+5056.915065924" watchObservedRunningTime="2026-01-30 06:32:01.557322706 +0000 UTC m=+5056.927232973" Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.568474 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.5684580649999997 podStartE2EDuration="3.568458065s" podCreationTimestamp="2026-01-30 06:31:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:01.566123438 +0000 UTC m=+5056.936033695" watchObservedRunningTime="2026-01-30 06:32:01.568458065 +0000 UTC m=+5056.938368322" Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.593368 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.59335308 podStartE2EDuration="3.59335308s" podCreationTimestamp="2026-01-30 06:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:01.588557912 +0000 UTC m=+5056.958468169" watchObservedRunningTime="2026-01-30 06:32:01.59335308 +0000 UTC m=+5056.963263337" Jan 30 06:32:01 crc kubenswrapper[4931]: I0130 06:32:01.620615 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.620591522 podStartE2EDuration="3.620591522s" podCreationTimestamp="2026-01-30 06:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:01.61704831 +0000 UTC m=+5056.986958567" watchObservedRunningTime="2026-01-30 06:32:01.620591522 +0000 UTC m=+5056.990501799" Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.449989 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.487415 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.523116 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"a05ee1a7-c012-4766-8d48-3b508d4f8cd2","Type":"ContainerStarted","Data":"1ed8a4f374983be1cab867ee0d2de08020dd1e52f75d07cd64012203969c23c3"} Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.523187 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a05ee1a7-c012-4766-8d48-3b508d4f8cd2","Type":"ContainerStarted","Data":"1abd673149956f76172a016558a841e76630b83138e04eef0eec03140deed6ac"} Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.523948 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.561388 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.561363811 podStartE2EDuration="4.561363811s" podCreationTimestamp="2026-01-30 06:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:02.54877238 +0000 UTC m=+5057.918682717" watchObservedRunningTime="2026-01-30 06:32:02.561363811 +0000 UTC m=+5057.931274108" Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.625686 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.659615 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Jan 30 06:32:02 crc kubenswrapper[4931]: I0130 06:32:02.665868 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Jan 30 06:32:04 crc kubenswrapper[4931]: I0130 06:32:04.450559 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 30 06:32:04 crc kubenswrapper[4931]: I0130 06:32:04.487926 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-sb-1" Jan 30 06:32:04 crc kubenswrapper[4931]: I0130 06:32:04.525239 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Jan 30 06:32:04 crc kubenswrapper[4931]: I0130 06:32:04.625809 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 30 06:32:04 crc kubenswrapper[4931]: I0130 06:32:04.659273 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Jan 30 06:32:04 crc kubenswrapper[4931]: I0130 06:32:04.666090 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.501531 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.572880 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.589254 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.622805 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.627734 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.651551 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.705108 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.743036 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.748927 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.835500 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-df6c6d7b7-lzfwn"] Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.838214 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.841867 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.846780 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df6c6d7b7-lzfwn"] Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.936869 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-config\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.936944 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-ovsdbserver-sb\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.937363 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-dns-svc\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" 
(UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:05 crc kubenswrapper[4931]: I0130 06:32:05.937466 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h659w\" (UniqueName: \"kubernetes.io/projected/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-kube-api-access-h659w\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.038532 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-dns-svc\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.038632 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h659w\" (UniqueName: \"kubernetes.io/projected/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-kube-api-access-h659w\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.038673 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-config\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.038696 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-ovsdbserver-sb\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") 
" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.039583 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-config\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.039589 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-ovsdbserver-sb\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.042068 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-dns-svc\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.061212 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h659w\" (UniqueName: \"kubernetes.io/projected/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-kube-api-access-h659w\") pod \"dnsmasq-dns-df6c6d7b7-lzfwn\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") " pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.170367 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.594812 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df6c6d7b7-lzfwn"] Jan 30 06:32:06 crc kubenswrapper[4931]: W0130 06:32:06.601694 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe396dfb_9a46_4bb7_9374_3ffe00f58db8.slice/crio-a84c445d4b480f9e8a3200bbdd1a8a6e97bfb9e01ef343ee8be5fd00bdf76110 WatchSource:0}: Error finding container a84c445d4b480f9e8a3200bbdd1a8a6e97bfb9e01ef343ee8be5fd00bdf76110: Status 404 returned error can't find the container with id a84c445d4b480f9e8a3200bbdd1a8a6e97bfb9e01ef343ee8be5fd00bdf76110 Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.602106 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.884184 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df6c6d7b7-lzfwn"] Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.918614 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666dc49759-6999t"] Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.929298 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666dc49759-6999t" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.932896 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 30 06:32:06 crc kubenswrapper[4931]: I0130 06:32:06.935333 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666dc49759-6999t"] Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.061798 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-dns-svc\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t" Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.061847 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-nb\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t" Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.061907 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-config\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t" Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.061951 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-sb\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t" 
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.062021 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcfn7\" (UniqueName: \"kubernetes.io/projected/0e31871f-729e-4b67-98d0-96973ea90de3-kube-api-access-qcfn7\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.163190 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-config\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.163555 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-sb\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.163755 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcfn7\" (UniqueName: \"kubernetes.io/projected/0e31871f-729e-4b67-98d0-96973ea90de3-kube-api-access-qcfn7\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.163895 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-dns-svc\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.163998 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-nb\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.165071 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-nb\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.165090 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-sb\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.165097 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-dns-svc\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.165510 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-config\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.201625 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcfn7\" (UniqueName: \"kubernetes.io/projected/0e31871f-729e-4b67-98d0-96973ea90de3-kube-api-access-qcfn7\") pod \"dnsmasq-dns-666dc49759-6999t\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.258223 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.566536 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerID="0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1" exitCode=0
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.566579 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" event={"ID":"fe396dfb-9a46-4bb7-9374-3ffe00f58db8","Type":"ContainerDied","Data":"0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1"}
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.566922 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" event={"ID":"fe396dfb-9a46-4bb7-9374-3ffe00f58db8","Type":"ContainerStarted","Data":"a84c445d4b480f9e8a3200bbdd1a8a6e97bfb9e01ef343ee8be5fd00bdf76110"}
Jan 30 06:32:07 crc kubenswrapper[4931]: W0130 06:32:07.703849 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e31871f_729e_4b67_98d0_96973ea90de3.slice/crio-040800f5a3860baea453ed500ca3333fe258941fbbb6b519709bd70c1f55e9fe WatchSource:0}: Error finding container 040800f5a3860baea453ed500ca3333fe258941fbbb6b519709bd70c1f55e9fe: Status 404 returned error can't find the container with id 040800f5a3860baea453ed500ca3333fe258941fbbb6b519709bd70c1f55e9fe
Jan 30 06:32:07 crc kubenswrapper[4931]: I0130 06:32:07.704582 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666dc49759-6999t"]
Jan 30 06:32:08 crc kubenswrapper[4931]: I0130 06:32:08.577990 4931 generic.go:334] "Generic (PLEG): container finished" podID="0e31871f-729e-4b67-98d0-96973ea90de3" containerID="89484ad9976e7f0f1e67abb8cfa05b476c12211570ced1863487804ae3932924" exitCode=0
Jan 30 06:32:08 crc kubenswrapper[4931]: I0130 06:32:08.578321 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666dc49759-6999t" event={"ID":"0e31871f-729e-4b67-98d0-96973ea90de3","Type":"ContainerDied","Data":"89484ad9976e7f0f1e67abb8cfa05b476c12211570ced1863487804ae3932924"}
Jan 30 06:32:08 crc kubenswrapper[4931]: I0130 06:32:08.578586 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666dc49759-6999t" event={"ID":"0e31871f-729e-4b67-98d0-96973ea90de3","Type":"ContainerStarted","Data":"040800f5a3860baea453ed500ca3333fe258941fbbb6b519709bd70c1f55e9fe"}
Jan 30 06:32:08 crc kubenswrapper[4931]: I0130 06:32:08.583965 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" event={"ID":"fe396dfb-9a46-4bb7-9374-3ffe00f58db8","Type":"ContainerStarted","Data":"ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b"}
Jan 30 06:32:08 crc kubenswrapper[4931]: I0130 06:32:08.584822 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" podUID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerName="dnsmasq-dns" containerID="cri-o://ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b" gracePeriod=10
Jan 30 06:32:08 crc kubenswrapper[4931]: I0130 06:32:08.585082 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn"
Jan 30 06:32:08 crc kubenswrapper[4931]: I0130 06:32:08.636449 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" podStartSLOduration=3.6363999700000003 podStartE2EDuration="3.63639997s" podCreationTimestamp="2026-01-30 06:32:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:08.629489442 +0000 UTC m=+5063.999399709" watchObservedRunningTime="2026-01-30 06:32:08.63639997 +0000 UTC m=+5064.006310237"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.013859 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.097213 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h659w\" (UniqueName: \"kubernetes.io/projected/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-kube-api-access-h659w\") pod \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") "
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.097310 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-dns-svc\") pod \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") "
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.097378 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-ovsdbserver-sb\") pod \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") "
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.097401 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-config\") pod \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\" (UID: \"fe396dfb-9a46-4bb7-9374-3ffe00f58db8\") "
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.111782 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-kube-api-access-h659w" (OuterVolumeSpecName: "kube-api-access-h659w") pod "fe396dfb-9a46-4bb7-9374-3ffe00f58db8" (UID: "fe396dfb-9a46-4bb7-9374-3ffe00f58db8"). InnerVolumeSpecName "kube-api-access-h659w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.136648 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-config" (OuterVolumeSpecName: "config") pod "fe396dfb-9a46-4bb7-9374-3ffe00f58db8" (UID: "fe396dfb-9a46-4bb7-9374-3ffe00f58db8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.138918 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe396dfb-9a46-4bb7-9374-3ffe00f58db8" (UID: "fe396dfb-9a46-4bb7-9374-3ffe00f58db8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.155397 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe396dfb-9a46-4bb7-9374-3ffe00f58db8" (UID: "fe396dfb-9a46-4bb7-9374-3ffe00f58db8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.200469 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h659w\" (UniqueName: \"kubernetes.io/projected/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-kube-api-access-h659w\") on node \"crc\" DevicePath \"\""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.200504 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.200620 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.200629 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe396dfb-9a46-4bb7-9374-3ffe00f58db8-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.595744 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666dc49759-6999t" event={"ID":"0e31871f-729e-4b67-98d0-96973ea90de3","Type":"ContainerStarted","Data":"287747b607e19df8febc584091bc73c9472de152802c4b7543aad2f9bb9038dd"}
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.596539 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.601216 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerID="ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b" exitCode=0
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.601484 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" event={"ID":"fe396dfb-9a46-4bb7-9374-3ffe00f58db8","Type":"ContainerDied","Data":"ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b"}
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.601567 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn" event={"ID":"fe396dfb-9a46-4bb7-9374-3ffe00f58db8","Type":"ContainerDied","Data":"a84c445d4b480f9e8a3200bbdd1a8a6e97bfb9e01ef343ee8be5fd00bdf76110"}
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.601593 4931 scope.go:117] "RemoveContainer" containerID="ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.601816 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df6c6d7b7-lzfwn"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.622222 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666dc49759-6999t" podStartSLOduration=3.622204292 podStartE2EDuration="3.622204292s" podCreationTimestamp="2026-01-30 06:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:09.616729215 +0000 UTC m=+5064.986639482" watchObservedRunningTime="2026-01-30 06:32:09.622204292 +0000 UTC m=+5064.992114559"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.627490 4931 scope.go:117] "RemoveContainer" containerID="0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.641286 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df6c6d7b7-lzfwn"]
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.659309 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-df6c6d7b7-lzfwn"]
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.664051 4931 scope.go:117] "RemoveContainer" containerID="ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b"
Jan 30 06:32:09 crc kubenswrapper[4931]: E0130 06:32:09.664394 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b\": container with ID starting with ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b not found: ID does not exist" containerID="ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.664434 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b"} err="failed to get container status \"ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b\": rpc error: code = NotFound desc = could not find container \"ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b\": container with ID starting with ae734a6ebd6227f8fcf285169a0430e26869fe5fdc3662f764a1920e525cd16b not found: ID does not exist"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.664453 4931 scope.go:117] "RemoveContainer" containerID="0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1"
Jan 30 06:32:09 crc kubenswrapper[4931]: E0130 06:32:09.664718 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1\": container with ID starting with 0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1 not found: ID does not exist" containerID="0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.664774 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1"} err="failed to get container status \"0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1\": rpc error: code = NotFound desc = could not find container \"0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1\": container with ID starting with 0d7e74b9ea8646e46814295903083348e0de1587c8a621c11772b41b5f2780d1 not found: ID does not exist"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.698697 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Jan 30 06:32:09 crc kubenswrapper[4931]: I0130 06:32:09.703339 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Jan 30 06:32:11 crc kubenswrapper[4931]: I0130 06:32:11.439628 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" path="/var/lib/kubelet/pods/fe396dfb-9a46-4bb7-9374-3ffe00f58db8/volumes"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.757791 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"]
Jan 30 06:32:12 crc kubenswrapper[4931]: E0130 06:32:12.758300 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerName="dnsmasq-dns"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.758324 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerName="dnsmasq-dns"
Jan 30 06:32:12 crc kubenswrapper[4931]: E0130 06:32:12.758362 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerName="init"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.758378 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerName="init"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.758662 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe396dfb-9a46-4bb7-9374-3ffe00f58db8" containerName="dnsmasq-dns"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.759680 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.762860 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.774413 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.865968 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/49126964-dfd0-4103-a3fd-5244d9b49b9d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.866089 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c442539e-eb47-4d14-9776-8bd69ac65863\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c442539e-eb47-4d14-9776-8bd69ac65863\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.866190 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6f4\" (UniqueName: \"kubernetes.io/projected/49126964-dfd0-4103-a3fd-5244d9b49b9d-kube-api-access-cw6f4\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.969758 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/49126964-dfd0-4103-a3fd-5244d9b49b9d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.969977 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c442539e-eb47-4d14-9776-8bd69ac65863\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c442539e-eb47-4d14-9776-8bd69ac65863\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.970145 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6f4\" (UniqueName: \"kubernetes.io/projected/49126964-dfd0-4103-a3fd-5244d9b49b9d-kube-api-access-cw6f4\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.975704 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.976081 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c442539e-eb47-4d14-9776-8bd69ac65863\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c442539e-eb47-4d14-9776-8bd69ac65863\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66f725498011e0f8f1b50f15f3253b4003921dec61a01597fc0176e58e7ec4fd/globalmount\"" pod="openstack/ovn-copy-data"
Jan 30 06:32:12 crc kubenswrapper[4931]: I0130 06:32:12.980650 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/49126964-dfd0-4103-a3fd-5244d9b49b9d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:13 crc kubenswrapper[4931]: I0130 06:32:13.422592 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:32:13 crc kubenswrapper[4931]: E0130 06:32:13.423070 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:32:13 crc kubenswrapper[4931]: I0130 06:32:13.505232 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6f4\" (UniqueName: \"kubernetes.io/projected/49126964-dfd0-4103-a3fd-5244d9b49b9d-kube-api-access-cw6f4\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:13 crc kubenswrapper[4931]: I0130 06:32:13.646687 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c442539e-eb47-4d14-9776-8bd69ac65863\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c442539e-eb47-4d14-9776-8bd69ac65863\") pod \"ovn-copy-data\" (UID: \"49126964-dfd0-4103-a3fd-5244d9b49b9d\") " pod="openstack/ovn-copy-data"
Jan 30 06:32:13 crc kubenswrapper[4931]: I0130 06:32:13.702218 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Jan 30 06:32:14 crc kubenswrapper[4931]: I0130 06:32:14.119910 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Jan 30 06:32:14 crc kubenswrapper[4931]: W0130 06:32:14.124529 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49126964_dfd0_4103_a3fd_5244d9b49b9d.slice/crio-6bd556cbb3b2ed9e4e6f66be6766dfb9de16e482a68ca2ec035ae43e8913d115 WatchSource:0}: Error finding container 6bd556cbb3b2ed9e4e6f66be6766dfb9de16e482a68ca2ec035ae43e8913d115: Status 404 returned error can't find the container with id 6bd556cbb3b2ed9e4e6f66be6766dfb9de16e482a68ca2ec035ae43e8913d115
Jan 30 06:32:14 crc kubenswrapper[4931]: I0130 06:32:14.657895 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"49126964-dfd0-4103-a3fd-5244d9b49b9d","Type":"ContainerStarted","Data":"6bd556cbb3b2ed9e4e6f66be6766dfb9de16e482a68ca2ec035ae43e8913d115"}
Jan 30 06:32:15 crc kubenswrapper[4931]: I0130 06:32:15.670078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"49126964-dfd0-4103-a3fd-5244d9b49b9d","Type":"ContainerStarted","Data":"0617797c0cfbc4de0b816608cfa5f5f2f1749246b5d34b737bd828707ea93886"}
Jan 30 06:32:15 crc kubenswrapper[4931]: I0130 06:32:15.698969 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=4.202444561 podStartE2EDuration="4.69894336s" podCreationTimestamp="2026-01-30 06:32:11 +0000 UTC" firstStartedPulling="2026-01-30 06:32:14.126671598 +0000 UTC m=+5069.496581885" lastFinishedPulling="2026-01-30 06:32:14.623170387 +0000 UTC m=+5069.993080684" observedRunningTime="2026-01-30 06:32:15.690732395 +0000 UTC m=+5071.060642692" watchObservedRunningTime="2026-01-30 06:32:15.69894336 +0000 UTC m=+5071.068853647"
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.260691 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666dc49759-6999t"
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.339007 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-8dc7b"]
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.339636 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" podUID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerName="dnsmasq-dns" containerID="cri-o://c38a0669baf43cbcb3f248e3fbd7e19a5da64c872efcf71b2a52e9bac3e9cedd" gracePeriod=10
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.696938 4931 generic.go:334] "Generic (PLEG): container finished" podID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerID="c38a0669baf43cbcb3f248e3fbd7e19a5da64c872efcf71b2a52e9bac3e9cedd" exitCode=0
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.696974 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" event={"ID":"df16978b-d22c-4dd1-87d8-330cf82a859d","Type":"ContainerDied","Data":"c38a0669baf43cbcb3f248e3fbd7e19a5da64c872efcf71b2a52e9bac3e9cedd"}
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.797176 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-8dc7b"
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.970936 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-config\") pod \"df16978b-d22c-4dd1-87d8-330cf82a859d\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") "
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.971042 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/df16978b-d22c-4dd1-87d8-330cf82a859d-kube-api-access-r68qr\") pod \"df16978b-d22c-4dd1-87d8-330cf82a859d\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") "
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.971255 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-dns-svc\") pod \"df16978b-d22c-4dd1-87d8-330cf82a859d\" (UID: \"df16978b-d22c-4dd1-87d8-330cf82a859d\") "
Jan 30 06:32:17 crc kubenswrapper[4931]: I0130 06:32:17.996471 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df16978b-d22c-4dd1-87d8-330cf82a859d-kube-api-access-r68qr" (OuterVolumeSpecName: "kube-api-access-r68qr") pod "df16978b-d22c-4dd1-87d8-330cf82a859d" (UID: "df16978b-d22c-4dd1-87d8-330cf82a859d"). InnerVolumeSpecName "kube-api-access-r68qr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.033702 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df16978b-d22c-4dd1-87d8-330cf82a859d" (UID: "df16978b-d22c-4dd1-87d8-330cf82a859d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.043405 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-config" (OuterVolumeSpecName: "config") pod "df16978b-d22c-4dd1-87d8-330cf82a859d" (UID: "df16978b-d22c-4dd1-87d8-330cf82a859d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.073161 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.073199 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/df16978b-d22c-4dd1-87d8-330cf82a859d-kube-api-access-r68qr\") on node \"crc\" DevicePath \"\""
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.073213 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df16978b-d22c-4dd1-87d8-330cf82a859d-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.712974 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-8dc7b" event={"ID":"df16978b-d22c-4dd1-87d8-330cf82a859d","Type":"ContainerDied","Data":"e39274ddf419bf92df1598af572cac14fef4ddf978c753b8f221ebec54b897c7"}
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.713035 4931 scope.go:117] "RemoveContainer" containerID="c38a0669baf43cbcb3f248e3fbd7e19a5da64c872efcf71b2a52e9bac3e9cedd"
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.713045 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-8dc7b"
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.740195 4931 scope.go:117] "RemoveContainer" containerID="a56ecc0c98ffc762965d58506b3e81c6c6637f6a00e16f27ab2be355f3d037e0"
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.788887 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-8dc7b"]
Jan 30 06:32:18 crc kubenswrapper[4931]: I0130 06:32:18.801364 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-8dc7b"]
Jan 30 06:32:19 crc kubenswrapper[4931]: I0130 06:32:19.437589 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df16978b-d22c-4dd1-87d8-330cf82a859d" path="/var/lib/kubelet/pods/df16978b-d22c-4dd1-87d8-330cf82a859d/volumes"
Jan 30 06:32:20 crc kubenswrapper[4931]: E0130 06:32:20.494961 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.179:59698->38.102.83.179:45103: write tcp 38.102.83.179:59698->38.102.83.179:45103: write: broken pipe
Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.324600 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 06:32:21 crc kubenswrapper[4931]: E0130 06:32:21.325570 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerName="dnsmasq-dns"
Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.325664 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerName="dnsmasq-dns"
Jan 30 06:32:21 crc kubenswrapper[4931]: E0130 06:32:21.325742 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerName="init"
Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.325809 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerName="init"
Jan 30
06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.326091 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="df16978b-d22c-4dd1-87d8-330cf82a859d" containerName="dnsmasq-dns" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.327172 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.334792 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.335125 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.335125 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5bcn4" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.336536 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.433176 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wp5w\" (UniqueName: \"kubernetes.io/projected/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-kube-api-access-9wp5w\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.433236 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-config\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.433280 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.433316 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-scripts\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.433380 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.535246 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.535371 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wp5w\" (UniqueName: \"kubernetes.io/projected/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-kube-api-access-9wp5w\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.535402 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-config\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 
30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.535438 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.535457 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-scripts\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.536166 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.537063 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-config\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.537733 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-scripts\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.545212 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.551601 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wp5w\" (UniqueName: \"kubernetes.io/projected/c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe-kube-api-access-9wp5w\") pod \"ovn-northd-0\" (UID: \"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe\") " pod="openstack/ovn-northd-0" Jan 30 06:32:21 crc kubenswrapper[4931]: I0130 06:32:21.654221 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 06:32:22 crc kubenswrapper[4931]: I0130 06:32:22.190163 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 06:32:22 crc kubenswrapper[4931]: I0130 06:32:22.757507 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe","Type":"ContainerStarted","Data":"056c368bc681352e80840b63f8e913f984184e6ed487de39d00e2b5f9a6c4fe1"} Jan 30 06:32:22 crc kubenswrapper[4931]: I0130 06:32:22.758074 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe","Type":"ContainerStarted","Data":"1656972ea219ef061adb87d6947c3351e89efdbce4b92eea37cb77535ec15950"} Jan 30 06:32:22 crc kubenswrapper[4931]: I0130 06:32:22.758098 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe","Type":"ContainerStarted","Data":"7d52b8dd2614ca3683972322875ce159bcc9526f76eb6a9e1f344f059d916f3a"} Jan 30 06:32:22 crc kubenswrapper[4931]: I0130 06:32:22.758146 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 30 06:32:22 crc kubenswrapper[4931]: I0130 06:32:22.794495 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" 
podStartSLOduration=1.794467698 podStartE2EDuration="1.794467698s" podCreationTimestamp="2026-01-30 06:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:22.782196756 +0000 UTC m=+5078.152107053" watchObservedRunningTime="2026-01-30 06:32:22.794467698 +0000 UTC m=+5078.164377985" Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.765245 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4gjjc"] Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.767386 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.781373 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9c4e-account-create-update-vz7cn"] Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.783144 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.785674 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.790275 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4gjjc"] Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.796907 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9c4e-account-create-update-vz7cn"] Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.948329 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d95012d1-b402-48eb-baf4-36fabfd1e4f2-operator-scripts\") pod \"keystone-9c4e-account-create-update-vz7cn\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.948391 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5rh\" (UniqueName: \"kubernetes.io/projected/d5b58826-6a83-4c91-a9f7-8c6c861c509b-kube-api-access-zn5rh\") pod \"keystone-db-create-4gjjc\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.948516 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvc8r\" (UniqueName: \"kubernetes.io/projected/d95012d1-b402-48eb-baf4-36fabfd1e4f2-kube-api-access-xvc8r\") pod \"keystone-9c4e-account-create-update-vz7cn\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:26 crc kubenswrapper[4931]: I0130 06:32:26.948599 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b58826-6a83-4c91-a9f7-8c6c861c509b-operator-scripts\") pod \"keystone-db-create-4gjjc\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.050285 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvc8r\" (UniqueName: \"kubernetes.io/projected/d95012d1-b402-48eb-baf4-36fabfd1e4f2-kube-api-access-xvc8r\") pod \"keystone-9c4e-account-create-update-vz7cn\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.050372 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b58826-6a83-4c91-a9f7-8c6c861c509b-operator-scripts\") pod \"keystone-db-create-4gjjc\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.050413 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d95012d1-b402-48eb-baf4-36fabfd1e4f2-operator-scripts\") pod \"keystone-9c4e-account-create-update-vz7cn\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.050449 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5rh\" (UniqueName: \"kubernetes.io/projected/d5b58826-6a83-4c91-a9f7-8c6c861c509b-kube-api-access-zn5rh\") pod \"keystone-db-create-4gjjc\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 
06:32:27.051140 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b58826-6a83-4c91-a9f7-8c6c861c509b-operator-scripts\") pod \"keystone-db-create-4gjjc\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.051331 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d95012d1-b402-48eb-baf4-36fabfd1e4f2-operator-scripts\") pod \"keystone-9c4e-account-create-update-vz7cn\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.082324 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvc8r\" (UniqueName: \"kubernetes.io/projected/d95012d1-b402-48eb-baf4-36fabfd1e4f2-kube-api-access-xvc8r\") pod \"keystone-9c4e-account-create-update-vz7cn\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.083866 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5rh\" (UniqueName: \"kubernetes.io/projected/d5b58826-6a83-4c91-a9f7-8c6c861c509b-kube-api-access-zn5rh\") pod \"keystone-db-create-4gjjc\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.116993 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.125114 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.426820 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:32:27 crc kubenswrapper[4931]: E0130 06:32:27.427220 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.576181 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9c4e-account-create-update-vz7cn"] Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.777935 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4gjjc"] Jan 30 06:32:27 crc kubenswrapper[4931]: W0130 06:32:27.801481 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5b58826_6a83_4c91_a9f7_8c6c861c509b.slice/crio-92b95bd2e19b7758033797d9e3976f9ced241ca140f8908b86261492c6bc081f WatchSource:0}: Error finding container 92b95bd2e19b7758033797d9e3976f9ced241ca140f8908b86261492c6bc081f: Status 404 returned error can't find the container with id 92b95bd2e19b7758033797d9e3976f9ced241ca140f8908b86261492c6bc081f Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.818191 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4gjjc" event={"ID":"d5b58826-6a83-4c91-a9f7-8c6c861c509b","Type":"ContainerStarted","Data":"92b95bd2e19b7758033797d9e3976f9ced241ca140f8908b86261492c6bc081f"} Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 
06:32:27.819361 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c4e-account-create-update-vz7cn" event={"ID":"d95012d1-b402-48eb-baf4-36fabfd1e4f2","Type":"ContainerStarted","Data":"cbeb8eeb50114e9f7323f10c636e65f223a4a4ce3cb8536dd62a68b32d0fcd46"} Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.819404 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c4e-account-create-update-vz7cn" event={"ID":"d95012d1-b402-48eb-baf4-36fabfd1e4f2","Type":"ContainerStarted","Data":"be2c32b6f745597d8479e5ac5b6c48a807f5085a073d891cb3a28dbc0b82adf3"} Jan 30 06:32:27 crc kubenswrapper[4931]: I0130 06:32:27.853195 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9c4e-account-create-update-vz7cn" podStartSLOduration=1.853143578 podStartE2EDuration="1.853143578s" podCreationTimestamp="2026-01-30 06:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:27.840114764 +0000 UTC m=+5083.210025061" watchObservedRunningTime="2026-01-30 06:32:27.853143578 +0000 UTC m=+5083.223053865" Jan 30 06:32:28 crc kubenswrapper[4931]: I0130 06:32:28.832416 4931 generic.go:334] "Generic (PLEG): container finished" podID="d5b58826-6a83-4c91-a9f7-8c6c861c509b" containerID="a4061e639e286e3a321d0a950315a3048946e43d437d1b9673f6d152b515bf12" exitCode=0 Jan 30 06:32:28 crc kubenswrapper[4931]: I0130 06:32:28.832591 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4gjjc" event={"ID":"d5b58826-6a83-4c91-a9f7-8c6c861c509b","Type":"ContainerDied","Data":"a4061e639e286e3a321d0a950315a3048946e43d437d1b9673f6d152b515bf12"} Jan 30 06:32:29 crc kubenswrapper[4931]: I0130 06:32:29.843970 4931 generic.go:334] "Generic (PLEG): container finished" podID="d95012d1-b402-48eb-baf4-36fabfd1e4f2" 
containerID="cbeb8eeb50114e9f7323f10c636e65f223a4a4ce3cb8536dd62a68b32d0fcd46" exitCode=0 Jan 30 06:32:29 crc kubenswrapper[4931]: I0130 06:32:29.844006 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c4e-account-create-update-vz7cn" event={"ID":"d95012d1-b402-48eb-baf4-36fabfd1e4f2","Type":"ContainerDied","Data":"cbeb8eeb50114e9f7323f10c636e65f223a4a4ce3cb8536dd62a68b32d0fcd46"} Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.207571 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.313694 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn5rh\" (UniqueName: \"kubernetes.io/projected/d5b58826-6a83-4c91-a9f7-8c6c861c509b-kube-api-access-zn5rh\") pod \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.313982 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b58826-6a83-4c91-a9f7-8c6c861c509b-operator-scripts\") pod \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\" (UID: \"d5b58826-6a83-4c91-a9f7-8c6c861c509b\") " Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.314756 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b58826-6a83-4c91-a9f7-8c6c861c509b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5b58826-6a83-4c91-a9f7-8c6c861c509b" (UID: "d5b58826-6a83-4c91-a9f7-8c6c861c509b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.320623 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b58826-6a83-4c91-a9f7-8c6c861c509b-kube-api-access-zn5rh" (OuterVolumeSpecName: "kube-api-access-zn5rh") pod "d5b58826-6a83-4c91-a9f7-8c6c861c509b" (UID: "d5b58826-6a83-4c91-a9f7-8c6c861c509b"). InnerVolumeSpecName "kube-api-access-zn5rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.415495 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5b58826-6a83-4c91-a9f7-8c6c861c509b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.415936 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn5rh\" (UniqueName: \"kubernetes.io/projected/d5b58826-6a83-4c91-a9f7-8c6c861c509b-kube-api-access-zn5rh\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.854973 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4gjjc" Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.855079 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4gjjc" event={"ID":"d5b58826-6a83-4c91-a9f7-8c6c861c509b","Type":"ContainerDied","Data":"92b95bd2e19b7758033797d9e3976f9ced241ca140f8908b86261492c6bc081f"} Jan 30 06:32:30 crc kubenswrapper[4931]: I0130 06:32:30.855114 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92b95bd2e19b7758033797d9e3976f9ced241ca140f8908b86261492c6bc081f" Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.274157 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.431593 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d95012d1-b402-48eb-baf4-36fabfd1e4f2-operator-scripts\") pod \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.431711 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvc8r\" (UniqueName: \"kubernetes.io/projected/d95012d1-b402-48eb-baf4-36fabfd1e4f2-kube-api-access-xvc8r\") pod \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\" (UID: \"d95012d1-b402-48eb-baf4-36fabfd1e4f2\") " Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.432842 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95012d1-b402-48eb-baf4-36fabfd1e4f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d95012d1-b402-48eb-baf4-36fabfd1e4f2" (UID: "d95012d1-b402-48eb-baf4-36fabfd1e4f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.438450 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95012d1-b402-48eb-baf4-36fabfd1e4f2-kube-api-access-xvc8r" (OuterVolumeSpecName: "kube-api-access-xvc8r") pod "d95012d1-b402-48eb-baf4-36fabfd1e4f2" (UID: "d95012d1-b402-48eb-baf4-36fabfd1e4f2"). InnerVolumeSpecName "kube-api-access-xvc8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.535183 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d95012d1-b402-48eb-baf4-36fabfd1e4f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.535222 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvc8r\" (UniqueName: \"kubernetes.io/projected/d95012d1-b402-48eb-baf4-36fabfd1e4f2-kube-api-access-xvc8r\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.872464 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c4e-account-create-update-vz7cn" event={"ID":"d95012d1-b402-48eb-baf4-36fabfd1e4f2","Type":"ContainerDied","Data":"be2c32b6f745597d8479e5ac5b6c48a807f5085a073d891cb3a28dbc0b82adf3"} Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.872508 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be2c32b6f745597d8479e5ac5b6c48a807f5085a073d891cb3a28dbc0b82adf3" Jan 30 06:32:31 crc kubenswrapper[4931]: I0130 06:32:31.872627 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9c4e-account-create-update-vz7cn" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.566523 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ckz5s"] Jan 30 06:32:32 crc kubenswrapper[4931]: E0130 06:32:32.567338 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b58826-6a83-4c91-a9f7-8c6c861c509b" containerName="mariadb-database-create" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.567368 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b58826-6a83-4c91-a9f7-8c6c861c509b" containerName="mariadb-database-create" Jan 30 06:32:32 crc kubenswrapper[4931]: E0130 06:32:32.567417 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95012d1-b402-48eb-baf4-36fabfd1e4f2" containerName="mariadb-account-create-update" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.567452 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95012d1-b402-48eb-baf4-36fabfd1e4f2" containerName="mariadb-account-create-update" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.567700 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b58826-6a83-4c91-a9f7-8c6c861c509b" containerName="mariadb-database-create" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.567733 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95012d1-b402-48eb-baf4-36fabfd1e4f2" containerName="mariadb-account-create-update" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.568639 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.574346 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.574624 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.574831 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.575210 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lg2q7" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.575962 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ckz5s"] Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.656192 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9pdl\" (UniqueName: \"kubernetes.io/projected/65588685-2245-486d-b7a9-95b8a71f8ff7-kube-api-access-t9pdl\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.656292 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-config-data\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.656351 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-combined-ca-bundle\") pod \"keystone-db-sync-ckz5s\" (UID: 
\"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.758059 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-combined-ca-bundle\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.758140 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9pdl\" (UniqueName: \"kubernetes.io/projected/65588685-2245-486d-b7a9-95b8a71f8ff7-kube-api-access-t9pdl\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.758239 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-config-data\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.762612 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-combined-ca-bundle\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.762725 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-config-data\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: 
I0130 06:32:32.783111 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9pdl\" (UniqueName: \"kubernetes.io/projected/65588685-2245-486d-b7a9-95b8a71f8ff7-kube-api-access-t9pdl\") pod \"keystone-db-sync-ckz5s\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:32 crc kubenswrapper[4931]: I0130 06:32:32.890996 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:33 crc kubenswrapper[4931]: I0130 06:32:33.419211 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ckz5s"] Jan 30 06:32:33 crc kubenswrapper[4931]: W0130 06:32:33.429409 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65588685_2245_486d_b7a9_95b8a71f8ff7.slice/crio-455c2a7f6e8e121d57f56247f3e15caf25d977cfac9d6d567b2d39e578602343 WatchSource:0}: Error finding container 455c2a7f6e8e121d57f56247f3e15caf25d977cfac9d6d567b2d39e578602343: Status 404 returned error can't find the container with id 455c2a7f6e8e121d57f56247f3e15caf25d977cfac9d6d567b2d39e578602343 Jan 30 06:32:33 crc kubenswrapper[4931]: I0130 06:32:33.890180 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ckz5s" event={"ID":"65588685-2245-486d-b7a9-95b8a71f8ff7","Type":"ContainerStarted","Data":"cdd69294edc054ababcd2665bc64a680aa08a0804663ff331148d5c4aedf9140"} Jan 30 06:32:33 crc kubenswrapper[4931]: I0130 06:32:33.890741 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ckz5s" event={"ID":"65588685-2245-486d-b7a9-95b8a71f8ff7","Type":"ContainerStarted","Data":"455c2a7f6e8e121d57f56247f3e15caf25d977cfac9d6d567b2d39e578602343"} Jan 30 06:32:33 crc kubenswrapper[4931]: I0130 06:32:33.922744 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-ckz5s" podStartSLOduration=1.922710881 podStartE2EDuration="1.922710881s" podCreationTimestamp="2026-01-30 06:32:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:33.916127242 +0000 UTC m=+5089.286037519" watchObservedRunningTime="2026-01-30 06:32:33.922710881 +0000 UTC m=+5089.292621178" Jan 30 06:32:35 crc kubenswrapper[4931]: I0130 06:32:35.911536 4931 generic.go:334] "Generic (PLEG): container finished" podID="65588685-2245-486d-b7a9-95b8a71f8ff7" containerID="cdd69294edc054ababcd2665bc64a680aa08a0804663ff331148d5c4aedf9140" exitCode=0 Jan 30 06:32:35 crc kubenswrapper[4931]: I0130 06:32:35.911611 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ckz5s" event={"ID":"65588685-2245-486d-b7a9-95b8a71f8ff7","Type":"ContainerDied","Data":"cdd69294edc054ababcd2665bc64a680aa08a0804663ff331148d5c4aedf9140"} Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.316943 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.445032 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9pdl\" (UniqueName: \"kubernetes.io/projected/65588685-2245-486d-b7a9-95b8a71f8ff7-kube-api-access-t9pdl\") pod \"65588685-2245-486d-b7a9-95b8a71f8ff7\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.445204 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-combined-ca-bundle\") pod \"65588685-2245-486d-b7a9-95b8a71f8ff7\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.445293 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-config-data\") pod \"65588685-2245-486d-b7a9-95b8a71f8ff7\" (UID: \"65588685-2245-486d-b7a9-95b8a71f8ff7\") " Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.452219 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65588685-2245-486d-b7a9-95b8a71f8ff7-kube-api-access-t9pdl" (OuterVolumeSpecName: "kube-api-access-t9pdl") pod "65588685-2245-486d-b7a9-95b8a71f8ff7" (UID: "65588685-2245-486d-b7a9-95b8a71f8ff7"). InnerVolumeSpecName "kube-api-access-t9pdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.466781 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65588685-2245-486d-b7a9-95b8a71f8ff7" (UID: "65588685-2245-486d-b7a9-95b8a71f8ff7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.497370 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-config-data" (OuterVolumeSpecName: "config-data") pod "65588685-2245-486d-b7a9-95b8a71f8ff7" (UID: "65588685-2245-486d-b7a9-95b8a71f8ff7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.547606 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.547653 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65588685-2245-486d-b7a9-95b8a71f8ff7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.547663 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9pdl\" (UniqueName: \"kubernetes.io/projected/65588685-2245-486d-b7a9-95b8a71f8ff7-kube-api-access-t9pdl\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.937273 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ckz5s" event={"ID":"65588685-2245-486d-b7a9-95b8a71f8ff7","Type":"ContainerDied","Data":"455c2a7f6e8e121d57f56247f3e15caf25d977cfac9d6d567b2d39e578602343"} Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.937654 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="455c2a7f6e8e121d57f56247f3e15caf25d977cfac9d6d567b2d39e578602343" Jan 30 06:32:37 crc kubenswrapper[4931]: I0130 06:32:37.937364 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ckz5s" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.184673 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cc57676c-79k7x"] Jan 30 06:32:38 crc kubenswrapper[4931]: E0130 06:32:38.185030 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65588685-2245-486d-b7a9-95b8a71f8ff7" containerName="keystone-db-sync" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.185044 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="65588685-2245-486d-b7a9-95b8a71f8ff7" containerName="keystone-db-sync" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.185196 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="65588685-2245-486d-b7a9-95b8a71f8ff7" containerName="keystone-db-sync" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.186091 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.198757 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cc57676c-79k7x"] Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.264803 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jmsq5"] Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.268205 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.272244 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.272476 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lg2q7" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.272942 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.276342 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.279327 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.284828 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-sb\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.284991 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-nb\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.285117 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-dns-svc\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: 
\"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.285155 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-config\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.285182 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7x6\" (UniqueName: \"kubernetes.io/projected/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-kube-api-access-dd7x6\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.302413 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jmsq5"] Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386528 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-dns-svc\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386581 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-config\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386606 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7x6\" (UniqueName: 
\"kubernetes.io/projected/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-kube-api-access-dd7x6\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386644 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-credential-keys\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386682 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-config-data\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386700 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-sb\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386728 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzw69\" (UniqueName: \"kubernetes.io/projected/3ab4d491-82ea-4973-a8e9-ef26ba522b43-kube-api-access-zzw69\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386767 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-combined-ca-bundle\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386794 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-scripts\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386809 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-nb\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.386830 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-fernet-keys\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.387577 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-config\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.387574 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-dns-svc\") pod 
\"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.388245 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-nb\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.388259 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-sb\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.405566 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7x6\" (UniqueName: \"kubernetes.io/projected/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-kube-api-access-dd7x6\") pod \"dnsmasq-dns-cc57676c-79k7x\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.422838 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:32:38 crc kubenswrapper[4931]: E0130 06:32:38.423112 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 
06:32:38.488689 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-combined-ca-bundle\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.488752 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-scripts\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.488776 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-fernet-keys\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.489230 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-credential-keys\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.489293 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-config-data\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.489310 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzw69\" 
(UniqueName: \"kubernetes.io/projected/3ab4d491-82ea-4973-a8e9-ef26ba522b43-kube-api-access-zzw69\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.492172 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-scripts\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.492509 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-combined-ca-bundle\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.492799 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-config-data\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.493734 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-fernet-keys\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.495877 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-credential-keys\") pod \"keystone-bootstrap-jmsq5\" (UID: 
\"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.511389 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzw69\" (UniqueName: \"kubernetes.io/projected/3ab4d491-82ea-4973-a8e9-ef26ba522b43-kube-api-access-zzw69\") pod \"keystone-bootstrap-jmsq5\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.550325 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.603055 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.889899 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jmsq5"] Jan 30 06:32:38 crc kubenswrapper[4931]: W0130 06:32:38.898531 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ab4d491_82ea_4973_a8e9_ef26ba522b43.slice/crio-384326f31c2144527b26181eb4cdf0836f375339b22144caee3d08cacb27cb53 WatchSource:0}: Error finding container 384326f31c2144527b26181eb4cdf0836f375339b22144caee3d08cacb27cb53: Status 404 returned error can't find the container with id 384326f31c2144527b26181eb4cdf0836f375339b22144caee3d08cacb27cb53 Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.947270 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jmsq5" event={"ID":"3ab4d491-82ea-4973-a8e9-ef26ba522b43","Type":"ContainerStarted","Data":"384326f31c2144527b26181eb4cdf0836f375339b22144caee3d08cacb27cb53"} Jan 30 06:32:38 crc kubenswrapper[4931]: I0130 06:32:38.982537 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-cc57676c-79k7x"] Jan 30 06:32:38 crc kubenswrapper[4931]: W0130 06:32:38.985650 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5868fc_e1b4_4f28_ba8d_6b0d9fad2db1.slice/crio-3961340eb18072ebf266ecd7fc39a9becdb1fb9de589cabfaaf5229e8191f71b WatchSource:0}: Error finding container 3961340eb18072ebf266ecd7fc39a9becdb1fb9de589cabfaaf5229e8191f71b: Status 404 returned error can't find the container with id 3961340eb18072ebf266ecd7fc39a9becdb1fb9de589cabfaaf5229e8191f71b Jan 30 06:32:39 crc kubenswrapper[4931]: I0130 06:32:39.970297 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jmsq5" event={"ID":"3ab4d491-82ea-4973-a8e9-ef26ba522b43","Type":"ContainerStarted","Data":"eb43994040c586b7da805891db0738dbfdbd3eca5875691d38856aeb8f3f02e2"} Jan 30 06:32:39 crc kubenswrapper[4931]: I0130 06:32:39.972785 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerID="376254fe1540e485a58048ed599a0e1e2664491a5fb008a7c2926d2f38ef5153" exitCode=0 Jan 30 06:32:39 crc kubenswrapper[4931]: I0130 06:32:39.972830 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc57676c-79k7x" event={"ID":"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1","Type":"ContainerDied","Data":"376254fe1540e485a58048ed599a0e1e2664491a5fb008a7c2926d2f38ef5153"} Jan 30 06:32:39 crc kubenswrapper[4931]: I0130 06:32:39.972862 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc57676c-79k7x" event={"ID":"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1","Type":"ContainerStarted","Data":"3961340eb18072ebf266ecd7fc39a9becdb1fb9de589cabfaaf5229e8191f71b"} Jan 30 06:32:40 crc kubenswrapper[4931]: I0130 06:32:40.012034 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jmsq5" podStartSLOduration=2.012009069 
podStartE2EDuration="2.012009069s" podCreationTimestamp="2026-01-30 06:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:39.998485041 +0000 UTC m=+5095.368395338" watchObservedRunningTime="2026-01-30 06:32:40.012009069 +0000 UTC m=+5095.381919356" Jan 30 06:32:40 crc kubenswrapper[4931]: I0130 06:32:40.982969 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc57676c-79k7x" event={"ID":"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1","Type":"ContainerStarted","Data":"d7ae0ff751a18019fedb3fe9938258f86e6d1759fbc3c988b2c850d94022c34d"} Jan 30 06:32:40 crc kubenswrapper[4931]: I0130 06:32:40.983332 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:41 crc kubenswrapper[4931]: I0130 06:32:41.006617 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cc57676c-79k7x" podStartSLOduration=3.006596443 podStartE2EDuration="3.006596443s" podCreationTimestamp="2026-01-30 06:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:41.004575555 +0000 UTC m=+5096.374485812" watchObservedRunningTime="2026-01-30 06:32:41.006596443 +0000 UTC m=+5096.376506700" Jan 30 06:32:41 crc kubenswrapper[4931]: I0130 06:32:41.742998 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 30 06:32:43 crc kubenswrapper[4931]: I0130 06:32:43.002094 4931 generic.go:334] "Generic (PLEG): container finished" podID="3ab4d491-82ea-4973-a8e9-ef26ba522b43" containerID="eb43994040c586b7da805891db0738dbfdbd3eca5875691d38856aeb8f3f02e2" exitCode=0 Jan 30 06:32:43 crc kubenswrapper[4931]: I0130 06:32:43.002163 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jmsq5" 
event={"ID":"3ab4d491-82ea-4973-a8e9-ef26ba522b43","Type":"ContainerDied","Data":"eb43994040c586b7da805891db0738dbfdbd3eca5875691d38856aeb8f3f02e2"} Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.446383 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.524185 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-combined-ca-bundle\") pod \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.524574 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-config-data\") pod \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.524599 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-credential-keys\") pod \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.524649 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-scripts\") pod \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.524726 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-fernet-keys\") pod 
\"3ab4d491-82ea-4973-a8e9-ef26ba522b43\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.524876 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzw69\" (UniqueName: \"kubernetes.io/projected/3ab4d491-82ea-4973-a8e9-ef26ba522b43-kube-api-access-zzw69\") pod \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\" (UID: \"3ab4d491-82ea-4973-a8e9-ef26ba522b43\") " Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.532899 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab4d491-82ea-4973-a8e9-ef26ba522b43-kube-api-access-zzw69" (OuterVolumeSpecName: "kube-api-access-zzw69") pod "3ab4d491-82ea-4973-a8e9-ef26ba522b43" (UID: "3ab4d491-82ea-4973-a8e9-ef26ba522b43"). InnerVolumeSpecName "kube-api-access-zzw69". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.551982 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-scripts" (OuterVolumeSpecName: "scripts") pod "3ab4d491-82ea-4973-a8e9-ef26ba522b43" (UID: "3ab4d491-82ea-4973-a8e9-ef26ba522b43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.552652 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3ab4d491-82ea-4973-a8e9-ef26ba522b43" (UID: "3ab4d491-82ea-4973-a8e9-ef26ba522b43"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.572712 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ab4d491-82ea-4973-a8e9-ef26ba522b43" (UID: "3ab4d491-82ea-4973-a8e9-ef26ba522b43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.573553 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3ab4d491-82ea-4973-a8e9-ef26ba522b43" (UID: "3ab4d491-82ea-4973-a8e9-ef26ba522b43"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.578115 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-config-data" (OuterVolumeSpecName: "config-data") pod "3ab4d491-82ea-4973-a8e9-ef26ba522b43" (UID: "3ab4d491-82ea-4973-a8e9-ef26ba522b43"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.626595 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.626641 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzw69\" (UniqueName: \"kubernetes.io/projected/3ab4d491-82ea-4973-a8e9-ef26ba522b43-kube-api-access-zzw69\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.626659 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.626673 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.626686 4931 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:44 crc kubenswrapper[4931]: I0130 06:32:44.626697 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab4d491-82ea-4973-a8e9-ef26ba522b43-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.024142 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jmsq5" event={"ID":"3ab4d491-82ea-4973-a8e9-ef26ba522b43","Type":"ContainerDied","Data":"384326f31c2144527b26181eb4cdf0836f375339b22144caee3d08cacb27cb53"} Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 
06:32:45.024194 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384326f31c2144527b26181eb4cdf0836f375339b22144caee3d08cacb27cb53" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.024217 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jmsq5" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.128667 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jmsq5"] Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.137087 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jmsq5"] Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.202490 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dj4rz"] Jan 30 06:32:45 crc kubenswrapper[4931]: E0130 06:32:45.202908 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab4d491-82ea-4973-a8e9-ef26ba522b43" containerName="keystone-bootstrap" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.202930 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab4d491-82ea-4973-a8e9-ef26ba522b43" containerName="keystone-bootstrap" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.203112 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab4d491-82ea-4973-a8e9-ef26ba522b43" containerName="keystone-bootstrap" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.203798 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.209561 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.209619 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.209757 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.209809 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lg2q7" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.209937 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.223353 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dj4rz"] Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.341107 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-credential-keys\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.341168 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-fernet-keys\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.341214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-scripts\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.341309 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-combined-ca-bundle\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.341357 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcfr5\" (UniqueName: \"kubernetes.io/projected/26e6e702-ef29-49bd-836a-f46b2abd51cc-kube-api-access-zcfr5\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.341415 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-config-data\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.435998 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab4d491-82ea-4973-a8e9-ef26ba522b43" path="/var/lib/kubelet/pods/3ab4d491-82ea-4973-a8e9-ef26ba522b43/volumes" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.442305 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-combined-ca-bundle\") pod \"keystone-bootstrap-dj4rz\" (UID: 
\"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.442368 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcfr5\" (UniqueName: \"kubernetes.io/projected/26e6e702-ef29-49bd-836a-f46b2abd51cc-kube-api-access-zcfr5\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.442419 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-config-data\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.442571 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-credential-keys\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.442599 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-fernet-keys\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.442639 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-scripts\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc 
kubenswrapper[4931]: I0130 06:32:45.450276 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-credential-keys\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.450407 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-scripts\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.450742 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-combined-ca-bundle\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.451470 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-fernet-keys\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.460332 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-config-data\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.474839 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcfr5\" (UniqueName: 
\"kubernetes.io/projected/26e6e702-ef29-49bd-836a-f46b2abd51cc-kube-api-access-zcfr5\") pod \"keystone-bootstrap-dj4rz\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:45 crc kubenswrapper[4931]: I0130 06:32:45.558028 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:46 crc kubenswrapper[4931]: I0130 06:32:46.087368 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dj4rz"] Jan 30 06:32:47 crc kubenswrapper[4931]: I0130 06:32:47.048908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dj4rz" event={"ID":"26e6e702-ef29-49bd-836a-f46b2abd51cc","Type":"ContainerStarted","Data":"4c48b12d2b0648e9dfb8706d20edad9baaeb7875a958059b6dc53ace27e47c1e"} Jan 30 06:32:47 crc kubenswrapper[4931]: I0130 06:32:47.049319 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dj4rz" event={"ID":"26e6e702-ef29-49bd-836a-f46b2abd51cc","Type":"ContainerStarted","Data":"5059fe1de470f8b35ea58caed468c9ea16c78ba228c422fecb4fd0e2aa96cc2d"} Jan 30 06:32:47 crc kubenswrapper[4931]: I0130 06:32:47.089157 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dj4rz" podStartSLOduration=2.089133469 podStartE2EDuration="2.089133469s" podCreationTimestamp="2026-01-30 06:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:47.07834251 +0000 UTC m=+5102.448252817" watchObservedRunningTime="2026-01-30 06:32:47.089133469 +0000 UTC m=+5102.459043746" Jan 30 06:32:48 crc kubenswrapper[4931]: I0130 06:32:48.552756 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:32:48 crc kubenswrapper[4931]: I0130 06:32:48.625275 4931 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666dc49759-6999t"] Jan 30 06:32:48 crc kubenswrapper[4931]: I0130 06:32:48.625625 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666dc49759-6999t" podUID="0e31871f-729e-4b67-98d0-96973ea90de3" containerName="dnsmasq-dns" containerID="cri-o://287747b607e19df8febc584091bc73c9472de152802c4b7543aad2f9bb9038dd" gracePeriod=10 Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.064560 4931 generic.go:334] "Generic (PLEG): container finished" podID="0e31871f-729e-4b67-98d0-96973ea90de3" containerID="287747b607e19df8febc584091bc73c9472de152802c4b7543aad2f9bb9038dd" exitCode=0 Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.064618 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666dc49759-6999t" event={"ID":"0e31871f-729e-4b67-98d0-96973ea90de3","Type":"ContainerDied","Data":"287747b607e19df8febc584091bc73c9472de152802c4b7543aad2f9bb9038dd"} Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.066409 4931 generic.go:334] "Generic (PLEG): container finished" podID="26e6e702-ef29-49bd-836a-f46b2abd51cc" containerID="4c48b12d2b0648e9dfb8706d20edad9baaeb7875a958059b6dc53ace27e47c1e" exitCode=0 Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.066468 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dj4rz" event={"ID":"26e6e702-ef29-49bd-836a-f46b2abd51cc","Type":"ContainerDied","Data":"4c48b12d2b0648e9dfb8706d20edad9baaeb7875a958059b6dc53ace27e47c1e"} Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.143562 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666dc49759-6999t" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.209257 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcfn7\" (UniqueName: \"kubernetes.io/projected/0e31871f-729e-4b67-98d0-96973ea90de3-kube-api-access-qcfn7\") pod \"0e31871f-729e-4b67-98d0-96973ea90de3\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.209332 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-config\") pod \"0e31871f-729e-4b67-98d0-96973ea90de3\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.209370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-dns-svc\") pod \"0e31871f-729e-4b67-98d0-96973ea90de3\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.209573 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-nb\") pod \"0e31871f-729e-4b67-98d0-96973ea90de3\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.209632 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-sb\") pod \"0e31871f-729e-4b67-98d0-96973ea90de3\" (UID: \"0e31871f-729e-4b67-98d0-96973ea90de3\") " Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.217006 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0e31871f-729e-4b67-98d0-96973ea90de3-kube-api-access-qcfn7" (OuterVolumeSpecName: "kube-api-access-qcfn7") pod "0e31871f-729e-4b67-98d0-96973ea90de3" (UID: "0e31871f-729e-4b67-98d0-96973ea90de3"). InnerVolumeSpecName "kube-api-access-qcfn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.250454 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0e31871f-729e-4b67-98d0-96973ea90de3" (UID: "0e31871f-729e-4b67-98d0-96973ea90de3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.258235 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0e31871f-729e-4b67-98d0-96973ea90de3" (UID: "0e31871f-729e-4b67-98d0-96973ea90de3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.266015 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-config" (OuterVolumeSpecName: "config") pod "0e31871f-729e-4b67-98d0-96973ea90de3" (UID: "0e31871f-729e-4b67-98d0-96973ea90de3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.268718 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e31871f-729e-4b67-98d0-96973ea90de3" (UID: "0e31871f-729e-4b67-98d0-96973ea90de3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.311874 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcfn7\" (UniqueName: \"kubernetes.io/projected/0e31871f-729e-4b67-98d0-96973ea90de3-kube-api-access-qcfn7\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.312068 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.312149 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.312213 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:49 crc kubenswrapper[4931]: I0130 06:32:49.312289 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e31871f-729e-4b67-98d0-96973ea90de3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.075262 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666dc49759-6999t" event={"ID":"0e31871f-729e-4b67-98d0-96973ea90de3","Type":"ContainerDied","Data":"040800f5a3860baea453ed500ca3333fe258941fbbb6b519709bd70c1f55e9fe"} Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.075315 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666dc49759-6999t" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.077197 4931 scope.go:117] "RemoveContainer" containerID="287747b607e19df8febc584091bc73c9472de152802c4b7543aad2f9bb9038dd" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.102152 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666dc49759-6999t"] Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.111782 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666dc49759-6999t"] Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.113764 4931 scope.go:117] "RemoveContainer" containerID="89484ad9976e7f0f1e67abb8cfa05b476c12211570ced1863487804ae3932924" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.465526 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dj4rz" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.536810 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcfr5\" (UniqueName: \"kubernetes.io/projected/26e6e702-ef29-49bd-836a-f46b2abd51cc-kube-api-access-zcfr5\") pod \"26e6e702-ef29-49bd-836a-f46b2abd51cc\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.536898 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-combined-ca-bundle\") pod \"26e6e702-ef29-49bd-836a-f46b2abd51cc\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.536925 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-config-data\") pod \"26e6e702-ef29-49bd-836a-f46b2abd51cc\" (UID: 
\"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.536953 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-fernet-keys\") pod \"26e6e702-ef29-49bd-836a-f46b2abd51cc\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.536984 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-scripts\") pod \"26e6e702-ef29-49bd-836a-f46b2abd51cc\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.537006 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-credential-keys\") pod \"26e6e702-ef29-49bd-836a-f46b2abd51cc\" (UID: \"26e6e702-ef29-49bd-836a-f46b2abd51cc\") " Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.541239 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "26e6e702-ef29-49bd-836a-f46b2abd51cc" (UID: "26e6e702-ef29-49bd-836a-f46b2abd51cc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.542664 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e6e702-ef29-49bd-836a-f46b2abd51cc-kube-api-access-zcfr5" (OuterVolumeSpecName: "kube-api-access-zcfr5") pod "26e6e702-ef29-49bd-836a-f46b2abd51cc" (UID: "26e6e702-ef29-49bd-836a-f46b2abd51cc"). InnerVolumeSpecName "kube-api-access-zcfr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.545901 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "26e6e702-ef29-49bd-836a-f46b2abd51cc" (UID: "26e6e702-ef29-49bd-836a-f46b2abd51cc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.547474 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-scripts" (OuterVolumeSpecName: "scripts") pod "26e6e702-ef29-49bd-836a-f46b2abd51cc" (UID: "26e6e702-ef29-49bd-836a-f46b2abd51cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.565555 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-config-data" (OuterVolumeSpecName: "config-data") pod "26e6e702-ef29-49bd-836a-f46b2abd51cc" (UID: "26e6e702-ef29-49bd-836a-f46b2abd51cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.572131 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26e6e702-ef29-49bd-836a-f46b2abd51cc" (UID: "26e6e702-ef29-49bd-836a-f46b2abd51cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.639324 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcfr5\" (UniqueName: \"kubernetes.io/projected/26e6e702-ef29-49bd-836a-f46b2abd51cc-kube-api-access-zcfr5\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.639371 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.639381 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.639390 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.639559 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:50 crc kubenswrapper[4931]: I0130 06:32:50.639572 4931 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26e6e702-ef29-49bd-836a-f46b2abd51cc-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.085989 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dj4rz" event={"ID":"26e6e702-ef29-49bd-836a-f46b2abd51cc","Type":"ContainerDied","Data":"5059fe1de470f8b35ea58caed468c9ea16c78ba228c422fecb4fd0e2aa96cc2d"} Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 
06:32:51.086346 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5059fe1de470f8b35ea58caed468c9ea16c78ba228c422fecb4fd0e2aa96cc2d"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.086166 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dj4rz"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.437192 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e31871f-729e-4b67-98d0-96973ea90de3" path="/var/lib/kubelet/pods/0e31871f-729e-4b67-98d0-96973ea90de3/volumes"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.571735 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6bc679c867-wth9b"]
Jan 30 06:32:51 crc kubenswrapper[4931]: E0130 06:32:51.572199 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e31871f-729e-4b67-98d0-96973ea90de3" containerName="dnsmasq-dns"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.572228 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e31871f-729e-4b67-98d0-96973ea90de3" containerName="dnsmasq-dns"
Jan 30 06:32:51 crc kubenswrapper[4931]: E0130 06:32:51.572279 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e31871f-729e-4b67-98d0-96973ea90de3" containerName="init"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.572290 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e31871f-729e-4b67-98d0-96973ea90de3" containerName="init"
Jan 30 06:32:51 crc kubenswrapper[4931]: E0130 06:32:51.572310 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e6e702-ef29-49bd-836a-f46b2abd51cc" containerName="keystone-bootstrap"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.572324 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e6e702-ef29-49bd-836a-f46b2abd51cc" containerName="keystone-bootstrap"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.572602 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e31871f-729e-4b67-98d0-96973ea90de3" containerName="dnsmasq-dns"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.572636 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e6e702-ef29-49bd-836a-f46b2abd51cc" containerName="keystone-bootstrap"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.573485 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.576603 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lg2q7"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.576724 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.576825 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.579547 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.588853 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bc679c867-wth9b"]
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.654547 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-combined-ca-bundle\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.654610 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-fernet-keys\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.654635 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-config-data\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.654658 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crh9k\" (UniqueName: \"kubernetes.io/projected/18835617-9ad2-4502-bbda-d4ac538081bd-kube-api-access-crh9k\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.654689 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-scripts\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.654730 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-credential-keys\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.756465 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-credential-keys\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.756528 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-combined-ca-bundle\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.756565 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-fernet-keys\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.756587 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-config-data\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.756610 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crh9k\" (UniqueName: \"kubernetes.io/projected/18835617-9ad2-4502-bbda-d4ac538081bd-kube-api-access-crh9k\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.756640 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-scripts\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.760447 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-fernet-keys\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.761330 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-scripts\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.761513 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-combined-ca-bundle\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.761578 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-credential-keys\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.761683 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18835617-9ad2-4502-bbda-d4ac538081bd-config-data\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.778046 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crh9k\" (UniqueName: \"kubernetes.io/projected/18835617-9ad2-4502-bbda-d4ac538081bd-kube-api-access-crh9k\") pod \"keystone-6bc679c867-wth9b\" (UID: \"18835617-9ad2-4502-bbda-d4ac538081bd\") " pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:51 crc kubenswrapper[4931]: I0130 06:32:51.891180 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:52 crc kubenswrapper[4931]: I0130 06:32:52.359560 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bc679c867-wth9b"]
Jan 30 06:32:53 crc kubenswrapper[4931]: I0130 06:32:53.107283 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bc679c867-wth9b" event={"ID":"18835617-9ad2-4502-bbda-d4ac538081bd","Type":"ContainerStarted","Data":"e36b0ed7b56ce182ac97bba3fef122763c524704b25abc5f048a9cb86089ee31"}
Jan 30 06:32:53 crc kubenswrapper[4931]: I0130 06:32:53.107553 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bc679c867-wth9b" event={"ID":"18835617-9ad2-4502-bbda-d4ac538081bd","Type":"ContainerStarted","Data":"804b21fe14aafecc45d89562e1b5cfdd1acc840acca33979e160c0fd6b4599e9"}
Jan 30 06:32:53 crc kubenswrapper[4931]: I0130 06:32:53.107593 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:32:53 crc kubenswrapper[4931]: I0130 06:32:53.134656 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6bc679c867-wth9b" podStartSLOduration=2.134629321 podStartE2EDuration="2.134629321s" podCreationTimestamp="2026-01-30 06:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:32:53.12588998 +0000 UTC m=+5108.495800247" watchObservedRunningTime="2026-01-30 06:32:53.134629321 +0000 UTC m=+5108.504539588"
Jan 30 06:32:53 crc kubenswrapper[4931]: I0130 06:32:53.422802 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:32:53 crc kubenswrapper[4931]: E0130 06:32:53.422975 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:33:04 crc kubenswrapper[4931]: I0130 06:33:04.422744 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:33:04 crc kubenswrapper[4931]: E0130 06:33:04.425276 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:33:15 crc kubenswrapper[4931]: I0130 06:33:15.434946 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:33:15 crc kubenswrapper[4931]: E0130 06:33:15.437478 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:33:23 crc kubenswrapper[4931]: I0130 06:33:23.259680 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6bc679c867-wth9b"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.733325 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.735292 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.737706 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.738044 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.740059 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8ssd9"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.746410 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.815044 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59zl\" (UniqueName: \"kubernetes.io/projected/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-kube-api-access-c59zl\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.815104 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config-secret\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.815140 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.917249 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59zl\" (UniqueName: \"kubernetes.io/projected/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-kube-api-access-c59zl\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.917311 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config-secret\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.917357 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.918408 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.934828 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config-secret\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:26 crc kubenswrapper[4931]: I0130 06:33:26.949975 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59zl\" (UniqueName: \"kubernetes.io/projected/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-kube-api-access-c59zl\") pod \"openstackclient\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") " pod="openstack/openstackclient"
Jan 30 06:33:27 crc kubenswrapper[4931]: I0130 06:33:27.058531 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:33:27 crc kubenswrapper[4931]: I0130 06:33:27.539515 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 30 06:33:27 crc kubenswrapper[4931]: W0130 06:33:27.560874 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod459f1ff6_e3cb_45a8_9a4a_0e24e7881407.slice/crio-f119add10e3e59ef80f35a8b0a1d9d2e4bc8283a75ba76bc8e9aac00049c6ac6 WatchSource:0}: Error finding container f119add10e3e59ef80f35a8b0a1d9d2e4bc8283a75ba76bc8e9aac00049c6ac6: Status 404 returned error can't find the container with id f119add10e3e59ef80f35a8b0a1d9d2e4bc8283a75ba76bc8e9aac00049c6ac6
Jan 30 06:33:28 crc kubenswrapper[4931]: I0130 06:33:28.480767 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"459f1ff6-e3cb-45a8-9a4a-0e24e7881407","Type":"ContainerStarted","Data":"73102981ee7cbbfca21cfe57a2e7dc11e7bcd50d611864190f31f741ca04bd1e"}
Jan 30 06:33:28 crc kubenswrapper[4931]: I0130 06:33:28.481145 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"459f1ff6-e3cb-45a8-9a4a-0e24e7881407","Type":"ContainerStarted","Data":"f119add10e3e59ef80f35a8b0a1d9d2e4bc8283a75ba76bc8e9aac00049c6ac6"}
Jan 30 06:33:28 crc kubenswrapper[4931]: I0130 06:33:28.515508 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.515482197 podStartE2EDuration="2.515482197s" podCreationTimestamp="2026-01-30 06:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:33:28.504012608 +0000 UTC m=+5143.873922905" watchObservedRunningTime="2026-01-30 06:33:28.515482197 +0000 UTC m=+5143.885392494"
Jan 30 06:33:30 crc kubenswrapper[4931]: I0130 06:33:30.422630 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:33:30 crc kubenswrapper[4931]: E0130 06:33:30.423544 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:33:41 crc kubenswrapper[4931]: I0130 06:33:41.421957 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:33:41 crc kubenswrapper[4931]: E0130 06:33:41.423026 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:33:55 crc kubenswrapper[4931]: I0130 06:33:55.431894 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:33:55 crc kubenswrapper[4931]: E0130 06:33:55.432734 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:33:59 crc kubenswrapper[4931]: I0130 06:33:59.089859 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4x52j"]
Jan 30 06:33:59 crc kubenswrapper[4931]: I0130 06:33:59.104129 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4x52j"]
Jan 30 06:33:59 crc kubenswrapper[4931]: I0130 06:33:59.447652 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56513a2a-14aa-4055-8b35-de5c272faab9" path="/var/lib/kubelet/pods/56513a2a-14aa-4055-8b35-de5c272faab9/volumes"
Jan 30 06:34:10 crc kubenswrapper[4931]: I0130 06:34:10.422355 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:34:10 crc kubenswrapper[4931]: E0130 06:34:10.423641 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:34:21 crc kubenswrapper[4931]: I0130 06:34:21.422962 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:34:21 crc kubenswrapper[4931]: E0130 06:34:21.424000 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:34:34 crc kubenswrapper[4931]: I0130 06:34:34.422038 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:34:34 crc kubenswrapper[4931]: E0130 06:34:34.422783 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:34:49 crc kubenswrapper[4931]: I0130 06:34:49.424753 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:34:49 crc kubenswrapper[4931]: E0130 06:34:49.425800 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:34:54 crc kubenswrapper[4931]: I0130 06:34:54.111028 4931 scope.go:117] "RemoveContainer" containerID="88feac8d2b5d033c066731d3fa1d66cc34b935b576cc56423d76770840e869ad"
Jan 30 06:35:00 crc kubenswrapper[4931]: I0130 06:35:00.422497 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b"
Jan 30 06:35:00 crc kubenswrapper[4931]: E0130 06:35:00.425306 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:35:07 crc kubenswrapper[4931]: I0130 06:35:07.990230 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cxbxk"]
Jan 30 06:35:07 crc kubenswrapper[4931]: I0130 06:35:07.991735 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.001239 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cxbxk"]
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.086619 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-64f9-account-create-update-sm7kp"]
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.087887 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.091257 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.096188 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vf22\" (UniqueName: \"kubernetes.io/projected/7e8b686f-89e9-4561-b4da-73c3087f1913-kube-api-access-4vf22\") pod \"barbican-db-create-cxbxk\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") " pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.096262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e8b686f-89e9-4561-b4da-73c3087f1913-operator-scripts\") pod \"barbican-db-create-cxbxk\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") " pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.098578 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-64f9-account-create-update-sm7kp"]
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.198168 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vf22\" (UniqueName: \"kubernetes.io/projected/7e8b686f-89e9-4561-b4da-73c3087f1913-kube-api-access-4vf22\") pod \"barbican-db-create-cxbxk\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") " pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.198223 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e8b686f-89e9-4561-b4da-73c3087f1913-operator-scripts\") pod \"barbican-db-create-cxbxk\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") " pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.198283 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7cl5\" (UniqueName: \"kubernetes.io/projected/f1237d07-19d9-47bb-8fb8-42e905fcc41b-kube-api-access-x7cl5\") pod \"barbican-64f9-account-create-update-sm7kp\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") " pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.198301 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1237d07-19d9-47bb-8fb8-42e905fcc41b-operator-scripts\") pod \"barbican-64f9-account-create-update-sm7kp\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") " pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.198953 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e8b686f-89e9-4561-b4da-73c3087f1913-operator-scripts\") pod \"barbican-db-create-cxbxk\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") " pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.221092 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vf22\" (UniqueName: \"kubernetes.io/projected/7e8b686f-89e9-4561-b4da-73c3087f1913-kube-api-access-4vf22\") pod \"barbican-db-create-cxbxk\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") " pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.299623 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7cl5\" (UniqueName: \"kubernetes.io/projected/f1237d07-19d9-47bb-8fb8-42e905fcc41b-kube-api-access-x7cl5\") pod \"barbican-64f9-account-create-update-sm7kp\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") " pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.299673 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1237d07-19d9-47bb-8fb8-42e905fcc41b-operator-scripts\") pod \"barbican-64f9-account-create-update-sm7kp\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") " pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.300473 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1237d07-19d9-47bb-8fb8-42e905fcc41b-operator-scripts\") pod \"barbican-64f9-account-create-update-sm7kp\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") " pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.309258 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.317167 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7cl5\" (UniqueName: \"kubernetes.io/projected/f1237d07-19d9-47bb-8fb8-42e905fcc41b-kube-api-access-x7cl5\") pod \"barbican-64f9-account-create-update-sm7kp\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") " pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.416883 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.717740 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cxbxk"]
Jan 30 06:35:08 crc kubenswrapper[4931]: I0130 06:35:08.944945 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-64f9-account-create-update-sm7kp"]
Jan 30 06:35:08 crc kubenswrapper[4931]: W0130 06:35:08.945754 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1237d07_19d9_47bb_8fb8_42e905fcc41b.slice/crio-8b58d2af4ffc0ac411e7a90473fbcac724986ee3773fc8ecb3af3a66e1ccd3b2 WatchSource:0}: Error finding container 8b58d2af4ffc0ac411e7a90473fbcac724986ee3773fc8ecb3af3a66e1ccd3b2: Status 404 returned error can't find the container with id 8b58d2af4ffc0ac411e7a90473fbcac724986ee3773fc8ecb3af3a66e1ccd3b2
Jan 30 06:35:09 crc kubenswrapper[4931]: E0130 06:35:09.161957 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e8b686f_89e9_4561_b4da_73c3087f1913.slice/crio-d12d2fc2afba982df405694f131f75a0e5433ce67ce82580ab99cf0746dfdbc2.scope\": RecentStats: unable to find data in memory cache]"
Jan 30 06:35:09 crc kubenswrapper[4931]: I0130 06:35:09.559934 4931 generic.go:334] "Generic (PLEG): container finished" podID="f1237d07-19d9-47bb-8fb8-42e905fcc41b" containerID="328e8eda0559ed6f531366255d38e56b8621e4607eb9add0633123842cfdda68" exitCode=0
Jan 30 06:35:09 crc kubenswrapper[4931]: I0130 06:35:09.560019 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-64f9-account-create-update-sm7kp" event={"ID":"f1237d07-19d9-47bb-8fb8-42e905fcc41b","Type":"ContainerDied","Data":"328e8eda0559ed6f531366255d38e56b8621e4607eb9add0633123842cfdda68"}
Jan 30 06:35:09 crc kubenswrapper[4931]: I0130 06:35:09.560412 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-64f9-account-create-update-sm7kp" event={"ID":"f1237d07-19d9-47bb-8fb8-42e905fcc41b","Type":"ContainerStarted","Data":"8b58d2af4ffc0ac411e7a90473fbcac724986ee3773fc8ecb3af3a66e1ccd3b2"}
Jan 30 06:35:09 crc kubenswrapper[4931]: I0130 06:35:09.570974 4931 generic.go:334] "Generic (PLEG): container finished" podID="7e8b686f-89e9-4561-b4da-73c3087f1913" containerID="d12d2fc2afba982df405694f131f75a0e5433ce67ce82580ab99cf0746dfdbc2" exitCode=0
Jan 30 06:35:09 crc kubenswrapper[4931]: I0130 06:35:09.571078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cxbxk" event={"ID":"7e8b686f-89e9-4561-b4da-73c3087f1913","Type":"ContainerDied","Data":"d12d2fc2afba982df405694f131f75a0e5433ce67ce82580ab99cf0746dfdbc2"}
Jan 30 06:35:09 crc kubenswrapper[4931]: I0130 06:35:09.571127 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cxbxk" event={"ID":"7e8b686f-89e9-4561-b4da-73c3087f1913","Type":"ContainerStarted","Data":"7ec2122d80a12a975a10746666145e6ab74df22454658140eeceda76eef19a0c"}
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.003404 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-64f9-account-create-update-sm7kp"
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.008410 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cxbxk"
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.054082 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7cl5\" (UniqueName: \"kubernetes.io/projected/f1237d07-19d9-47bb-8fb8-42e905fcc41b-kube-api-access-x7cl5\") pod \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") "
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.054137 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e8b686f-89e9-4561-b4da-73c3087f1913-operator-scripts\") pod \"7e8b686f-89e9-4561-b4da-73c3087f1913\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") "
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.054161 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1237d07-19d9-47bb-8fb8-42e905fcc41b-operator-scripts\") pod \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\" (UID: \"f1237d07-19d9-47bb-8fb8-42e905fcc41b\") "
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.054209 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vf22\" (UniqueName: \"kubernetes.io/projected/7e8b686f-89e9-4561-b4da-73c3087f1913-kube-api-access-4vf22\") pod \"7e8b686f-89e9-4561-b4da-73c3087f1913\" (UID: \"7e8b686f-89e9-4561-b4da-73c3087f1913\") "
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.059823 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e8b686f-89e9-4561-b4da-73c3087f1913-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e8b686f-89e9-4561-b4da-73c3087f1913" (UID: "7e8b686f-89e9-4561-b4da-73c3087f1913"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.059866 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1237d07-19d9-47bb-8fb8-42e905fcc41b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1237d07-19d9-47bb-8fb8-42e905fcc41b" (UID: "f1237d07-19d9-47bb-8fb8-42e905fcc41b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.062683 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8b686f-89e9-4561-b4da-73c3087f1913-kube-api-access-4vf22" (OuterVolumeSpecName: "kube-api-access-4vf22") pod "7e8b686f-89e9-4561-b4da-73c3087f1913" (UID: "7e8b686f-89e9-4561-b4da-73c3087f1913"). InnerVolumeSpecName "kube-api-access-4vf22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.070234 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1237d07-19d9-47bb-8fb8-42e905fcc41b-kube-api-access-x7cl5" (OuterVolumeSpecName: "kube-api-access-x7cl5") pod "f1237d07-19d9-47bb-8fb8-42e905fcc41b" (UID: "f1237d07-19d9-47bb-8fb8-42e905fcc41b"). InnerVolumeSpecName "kube-api-access-x7cl5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.155969 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7cl5\" (UniqueName: \"kubernetes.io/projected/f1237d07-19d9-47bb-8fb8-42e905fcc41b-kube-api-access-x7cl5\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.156008 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e8b686f-89e9-4561-b4da-73c3087f1913-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.156021 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1237d07-19d9-47bb-8fb8-42e905fcc41b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.156035 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vf22\" (UniqueName: \"kubernetes.io/projected/7e8b686f-89e9-4561-b4da-73c3087f1913-kube-api-access-4vf22\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.591129 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cxbxk" event={"ID":"7e8b686f-89e9-4561-b4da-73c3087f1913","Type":"ContainerDied","Data":"7ec2122d80a12a975a10746666145e6ab74df22454658140eeceda76eef19a0c"} Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.591559 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ec2122d80a12a975a10746666145e6ab74df22454658140eeceda76eef19a0c" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.591164 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cxbxk" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.593204 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-64f9-account-create-update-sm7kp" event={"ID":"f1237d07-19d9-47bb-8fb8-42e905fcc41b","Type":"ContainerDied","Data":"8b58d2af4ffc0ac411e7a90473fbcac724986ee3773fc8ecb3af3a66e1ccd3b2"} Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.593260 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b58d2af4ffc0ac411e7a90473fbcac724986ee3773fc8ecb3af3a66e1ccd3b2" Jan 30 06:35:11 crc kubenswrapper[4931]: I0130 06:35:11.593268 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-64f9-account-create-update-sm7kp" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.373171 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zzqsh"] Jan 30 06:35:13 crc kubenswrapper[4931]: E0130 06:35:13.374277 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1237d07-19d9-47bb-8fb8-42e905fcc41b" containerName="mariadb-account-create-update" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.374304 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1237d07-19d9-47bb-8fb8-42e905fcc41b" containerName="mariadb-account-create-update" Jan 30 06:35:13 crc kubenswrapper[4931]: E0130 06:35:13.374338 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8b686f-89e9-4561-b4da-73c3087f1913" containerName="mariadb-database-create" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.374351 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8b686f-89e9-4561-b4da-73c3087f1913" containerName="mariadb-database-create" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.374991 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8b686f-89e9-4561-b4da-73c3087f1913" 
containerName="mariadb-database-create" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.375064 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1237d07-19d9-47bb-8fb8-42e905fcc41b" containerName="mariadb-account-create-update" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.377125 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.386142 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.386211 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dnzcg" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.389528 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zzqsh"] Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.413878 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-combined-ca-bundle\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.414276 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-db-sync-config-data\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.414400 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-262pn\" (UniqueName: 
\"kubernetes.io/projected/60872807-e034-4844-9f79-8005640c308c-kube-api-access-262pn\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.515893 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-combined-ca-bundle\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.516292 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-db-sync-config-data\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.516409 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-262pn\" (UniqueName: \"kubernetes.io/projected/60872807-e034-4844-9f79-8005640c308c-kube-api-access-262pn\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.522412 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-combined-ca-bundle\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.522844 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-db-sync-config-data\") pod 
\"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.532614 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-262pn\" (UniqueName: \"kubernetes.io/projected/60872807-e034-4844-9f79-8005640c308c-kube-api-access-262pn\") pod \"barbican-db-sync-zzqsh\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:13 crc kubenswrapper[4931]: I0130 06:35:13.714304 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:14 crc kubenswrapper[4931]: I0130 06:35:14.052398 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zzqsh"] Jan 30 06:35:14 crc kubenswrapper[4931]: I0130 06:35:14.619745 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzqsh" event={"ID":"60872807-e034-4844-9f79-8005640c308c","Type":"ContainerStarted","Data":"c1b6dac81e48bbee78ee0997dd94e0a4ebd87490b839272e84bc72df445ca206"} Jan 30 06:35:14 crc kubenswrapper[4931]: I0130 06:35:14.620041 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzqsh" event={"ID":"60872807-e034-4844-9f79-8005640c308c","Type":"ContainerStarted","Data":"63d64ee019ed4b571b958873b656eed687082260a1ff967bfdde1ccd255d06fc"} Jan 30 06:35:14 crc kubenswrapper[4931]: I0130 06:35:14.640727 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zzqsh" podStartSLOduration=1.640699692 podStartE2EDuration="1.640699692s" podCreationTimestamp="2026-01-30 06:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:14.637447978 +0000 UTC m=+5250.007358235" watchObservedRunningTime="2026-01-30 06:35:14.640699692 +0000 UTC 
m=+5250.010609969" Jan 30 06:35:15 crc kubenswrapper[4931]: I0130 06:35:15.434336 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:35:15 crc kubenswrapper[4931]: E0130 06:35:15.434598 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:35:15 crc kubenswrapper[4931]: I0130 06:35:15.630878 4931 generic.go:334] "Generic (PLEG): container finished" podID="60872807-e034-4844-9f79-8005640c308c" containerID="c1b6dac81e48bbee78ee0997dd94e0a4ebd87490b839272e84bc72df445ca206" exitCode=0 Jan 30 06:35:15 crc kubenswrapper[4931]: I0130 06:35:15.630917 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzqsh" event={"ID":"60872807-e034-4844-9f79-8005640c308c","Type":"ContainerDied","Data":"c1b6dac81e48bbee78ee0997dd94e0a4ebd87490b839272e84bc72df445ca206"} Jan 30 06:35:16 crc kubenswrapper[4931]: I0130 06:35:16.988332 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.070371 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-db-sync-config-data\") pod \"60872807-e034-4844-9f79-8005640c308c\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.070467 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-combined-ca-bundle\") pod \"60872807-e034-4844-9f79-8005640c308c\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.070708 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-262pn\" (UniqueName: \"kubernetes.io/projected/60872807-e034-4844-9f79-8005640c308c-kube-api-access-262pn\") pod \"60872807-e034-4844-9f79-8005640c308c\" (UID: \"60872807-e034-4844-9f79-8005640c308c\") " Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.076140 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60872807-e034-4844-9f79-8005640c308c-kube-api-access-262pn" (OuterVolumeSpecName: "kube-api-access-262pn") pod "60872807-e034-4844-9f79-8005640c308c" (UID: "60872807-e034-4844-9f79-8005640c308c"). InnerVolumeSpecName "kube-api-access-262pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.084590 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "60872807-e034-4844-9f79-8005640c308c" (UID: "60872807-e034-4844-9f79-8005640c308c"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.097008 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60872807-e034-4844-9f79-8005640c308c" (UID: "60872807-e034-4844-9f79-8005640c308c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.172946 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.172976 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-262pn\" (UniqueName: \"kubernetes.io/projected/60872807-e034-4844-9f79-8005640c308c-kube-api-access-262pn\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.172988 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60872807-e034-4844-9f79-8005640c308c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.657279 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zzqsh" event={"ID":"60872807-e034-4844-9f79-8005640c308c","Type":"ContainerDied","Data":"63d64ee019ed4b571b958873b656eed687082260a1ff967bfdde1ccd255d06fc"} Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.657332 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d64ee019ed4b571b958873b656eed687082260a1ff967bfdde1ccd255d06fc" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.657336 4931 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zzqsh" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.879849 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-784d6f7789-45xt8"] Jan 30 06:35:17 crc kubenswrapper[4931]: E0130 06:35:17.880242 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60872807-e034-4844-9f79-8005640c308c" containerName="barbican-db-sync" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.880263 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="60872807-e034-4844-9f79-8005640c308c" containerName="barbican-db-sync" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.881196 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="60872807-e034-4844-9f79-8005640c308c" containerName="barbican-db-sync" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.888068 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.913056 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.915224 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dnzcg" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.915359 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.939032 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-784d6f7789-45xt8"] Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.953805 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-67d8db4f6b-c2v48"] Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.955536 4931 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:17 crc kubenswrapper[4931]: I0130 06:35:17.959903 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:17.998575 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-logs\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.001299 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-combined-ca-bundle\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.001352 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-config-data-custom\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.001634 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twf5p\" (UniqueName: \"kubernetes.io/projected/a72d7303-20af-4fe7-be58-962eaa52c31a-kube-api-access-twf5p\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc 
kubenswrapper[4931]: I0130 06:35:18.001792 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-config-data\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.001957 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-config-data-custom\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.002196 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72d7303-20af-4fe7-be58-962eaa52c31a-logs\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.002331 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-combined-ca-bundle\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.002449 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-config-data\") pod \"barbican-worker-784d6f7789-45xt8\" 
(UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.002768 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgbvb\" (UniqueName: \"kubernetes.io/projected/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-kube-api-access-sgbvb\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.006948 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67d8db4f6b-c2v48"] Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.044400 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-669bb76d6c-ld8p5"] Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.054219 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.064275 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-669bb76d6c-ld8p5"] Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.070623 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-847c6776d8-4sw8x"] Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.075917 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.080236 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-847c6776d8-4sw8x"] Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.080669 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104667 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-combined-ca-bundle\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104709 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-config-data-custom\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104739 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twf5p\" (UniqueName: \"kubernetes.io/projected/a72d7303-20af-4fe7-be58-962eaa52c31a-kube-api-access-twf5p\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104772 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-config-data\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " 
pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104797 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-config-data-custom\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104817 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t4f9\" (UniqueName: \"kubernetes.io/projected/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-kube-api-access-4t4f9\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72d7303-20af-4fe7-be58-962eaa52c31a-logs\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104856 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-dns-svc\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104872 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-combined-ca-bundle\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: 
\"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-config-data\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104940 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-config\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104960 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-sb\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.104994 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgbvb\" (UniqueName: \"kubernetes.io/projected/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-kube-api-access-sgbvb\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.105025 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-nb\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.105044 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-logs\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.105459 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-logs\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.105813 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72d7303-20af-4fe7-be58-962eaa52c31a-logs\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.108558 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-combined-ca-bundle\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.111016 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-combined-ca-bundle\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.111672 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-config-data\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.114532 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-config-data-custom\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.116510 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-config-data\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.126986 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a72d7303-20af-4fe7-be58-962eaa52c31a-config-data-custom\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.130802 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgbvb\" (UniqueName: 
\"kubernetes.io/projected/acbd1f75-958f-4fe5-8d52-f32c4d6c53f1-kube-api-access-sgbvb\") pod \"barbican-keystone-listener-67d8db4f6b-c2v48\" (UID: \"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1\") " pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.135476 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twf5p\" (UniqueName: \"kubernetes.io/projected/a72d7303-20af-4fe7-be58-962eaa52c31a-kube-api-access-twf5p\") pod \"barbican-worker-784d6f7789-45xt8\" (UID: \"a72d7303-20af-4fe7-be58-962eaa52c31a\") " pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206173 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-config-data\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206215 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t4f9\" (UniqueName: \"kubernetes.io/projected/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-kube-api-access-4t4f9\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206247 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-dns-svc\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206348 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-logs\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206387 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-combined-ca-bundle\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206434 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tskh5\" (UniqueName: \"kubernetes.io/projected/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-kube-api-access-tskh5\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206563 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-config\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206587 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-sb\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206618 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-config-data-custom\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206808 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-nb\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.206960 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-dns-svc\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.207317 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-config\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.207726 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-nb\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.207764 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-sb\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.222912 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t4f9\" (UniqueName: \"kubernetes.io/projected/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-kube-api-access-4t4f9\") pod \"dnsmasq-dns-669bb76d6c-ld8p5\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.239339 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-784d6f7789-45xt8" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.285876 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.308109 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-config-data\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.308166 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-logs\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.308187 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-combined-ca-bundle\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.308208 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tskh5\" (UniqueName: \"kubernetes.io/projected/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-kube-api-access-tskh5\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.308260 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-config-data-custom\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.309039 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-logs\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.312766 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-config-data-custom\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.312763 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-combined-ca-bundle\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.312864 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-config-data\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.329896 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tskh5\" (UniqueName: \"kubernetes.io/projected/62d9ff65-c8d2-413f-b323-47a1db5ea2ed-kube-api-access-tskh5\") pod \"barbican-api-847c6776d8-4sw8x\" (UID: \"62d9ff65-c8d2-413f-b323-47a1db5ea2ed\") " pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.370794 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.396381 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.748642 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-784d6f7789-45xt8"] Jan 30 06:35:18 crc kubenswrapper[4931]: W0130 06:35:18.776093 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda72d7303_20af_4fe7_be58_962eaa52c31a.slice/crio-9d6b3a10315ecaa3f9e38ab7113d934bf12afc216592235bbbc7e3070dbd10bf WatchSource:0}: Error finding container 9d6b3a10315ecaa3f9e38ab7113d934bf12afc216592235bbbc7e3070dbd10bf: Status 404 returned error can't find the container with id 9d6b3a10315ecaa3f9e38ab7113d934bf12afc216592235bbbc7e3070dbd10bf Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.832155 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-669bb76d6c-ld8p5"] Jan 30 06:35:18 crc kubenswrapper[4931]: W0130 06:35:18.837988 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66eeada2_dfe0_4ebc_af62_17af9f1ce15e.slice/crio-509c57c8ee870d1521b8c78081fe670bec4a3e4b66684ebdaba1d4543435f196 WatchSource:0}: Error finding container 509c57c8ee870d1521b8c78081fe670bec4a3e4b66684ebdaba1d4543435f196: Status 404 returned error can't find the container with id 509c57c8ee870d1521b8c78081fe670bec4a3e4b66684ebdaba1d4543435f196 Jan 30 06:35:18 crc kubenswrapper[4931]: I0130 06:35:18.887558 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67d8db4f6b-c2v48"] Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.081842 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-847c6776d8-4sw8x"] Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.681058 4931 generic.go:334] "Generic (PLEG): container finished" podID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" 
containerID="fac0f8070b365b5149eb56040af36ca71ef68ff65a5f0aef7d169c86e39479df" exitCode=0 Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.681105 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" event={"ID":"66eeada2-dfe0-4ebc-af62-17af9f1ce15e","Type":"ContainerDied","Data":"fac0f8070b365b5149eb56040af36ca71ef68ff65a5f0aef7d169c86e39479df"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.681455 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" event={"ID":"66eeada2-dfe0-4ebc-af62-17af9f1ce15e","Type":"ContainerStarted","Data":"509c57c8ee870d1521b8c78081fe670bec4a3e4b66684ebdaba1d4543435f196"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.684745 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-847c6776d8-4sw8x" event={"ID":"62d9ff65-c8d2-413f-b323-47a1db5ea2ed","Type":"ContainerStarted","Data":"8947c9d734a9979b06fa28d152fd4c11e04ce4dd699bc499c5572e840e332e86"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.684774 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-847c6776d8-4sw8x" event={"ID":"62d9ff65-c8d2-413f-b323-47a1db5ea2ed","Type":"ContainerStarted","Data":"502fc034db2d6646ab4a40fd5b988d6eb4f191d62fc0b96eb5a13ea1ffbbafbd"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.684783 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-847c6776d8-4sw8x" event={"ID":"62d9ff65-c8d2-413f-b323-47a1db5ea2ed","Type":"ContainerStarted","Data":"22ebe02ddcdace17f784f5c14374e84999e89084d51838f74e523549180e92d5"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.685714 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.685746 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.687862 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" event={"ID":"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1","Type":"ContainerStarted","Data":"fb1b28820a1408e8b9dd3f7446fa1adfd285d8b751ba27847ffcb7971471416c"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.687884 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" event={"ID":"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1","Type":"ContainerStarted","Data":"8c78e4e5c912e139faa60e53e6a80dc09c1d87c087efc4e5107e4fab1c5ea953"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.687894 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" event={"ID":"acbd1f75-958f-4fe5-8d52-f32c4d6c53f1","Type":"ContainerStarted","Data":"91165a486ae888d1f97850ffacd0dd3f78aad39b0075225e061af9f4bae73ca4"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.688868 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-784d6f7789-45xt8" event={"ID":"a72d7303-20af-4fe7-be58-962eaa52c31a","Type":"ContainerStarted","Data":"af9911eb0aac4d0427a84aac61b693548615547737bc7bcb9df3ccd210735a79"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.688883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-784d6f7789-45xt8" event={"ID":"a72d7303-20af-4fe7-be58-962eaa52c31a","Type":"ContainerStarted","Data":"2f35f310b30d2e5188e1f321d43675adde801898c34e058b69ee5e5c218a7827"} Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.688892 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-784d6f7789-45xt8" event={"ID":"a72d7303-20af-4fe7-be58-962eaa52c31a","Type":"ContainerStarted","Data":"9d6b3a10315ecaa3f9e38ab7113d934bf12afc216592235bbbc7e3070dbd10bf"} Jan 30 
06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.767592 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-784d6f7789-45xt8" podStartSLOduration=2.767576769 podStartE2EDuration="2.767576769s" podCreationTimestamp="2026-01-30 06:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:19.745110945 +0000 UTC m=+5255.115021202" watchObservedRunningTime="2026-01-30 06:35:19.767576769 +0000 UTC m=+5255.137487026" Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.821918 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-67d8db4f6b-c2v48" podStartSLOduration=2.821902248 podStartE2EDuration="2.821902248s" podCreationTimestamp="2026-01-30 06:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:19.792686 +0000 UTC m=+5255.162596257" watchObservedRunningTime="2026-01-30 06:35:19.821902248 +0000 UTC m=+5255.191812505" Jan 30 06:35:19 crc kubenswrapper[4931]: I0130 06:35:19.824458 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-847c6776d8-4sw8x" podStartSLOduration=1.8244511719999998 podStartE2EDuration="1.824451172s" podCreationTimestamp="2026-01-30 06:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:19.814243269 +0000 UTC m=+5255.184153516" watchObservedRunningTime="2026-01-30 06:35:19.824451172 +0000 UTC m=+5255.194361429" Jan 30 06:35:20 crc kubenswrapper[4931]: I0130 06:35:20.701518 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" 
event={"ID":"66eeada2-dfe0-4ebc-af62-17af9f1ce15e","Type":"ContainerStarted","Data":"ae5834eb7aa94f3cb815f35e315bb6ed049e191afc33534b32776fd0debd1c4a"} Jan 30 06:35:20 crc kubenswrapper[4931]: I0130 06:35:20.702042 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:20 crc kubenswrapper[4931]: I0130 06:35:20.722057 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" podStartSLOduration=3.7220374720000002 podStartE2EDuration="3.722037472s" podCreationTimestamp="2026-01-30 06:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:20.718379047 +0000 UTC m=+5256.088289314" watchObservedRunningTime="2026-01-30 06:35:20.722037472 +0000 UTC m=+5256.091947729" Jan 30 06:35:27 crc kubenswrapper[4931]: I0130 06:35:27.422341 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:35:27 crc kubenswrapper[4931]: E0130 06:35:27.423667 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:35:28 crc kubenswrapper[4931]: I0130 06:35:28.372613 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:35:28 crc kubenswrapper[4931]: I0130 06:35:28.453373 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc57676c-79k7x"] Jan 30 06:35:28 crc kubenswrapper[4931]: I0130 06:35:28.453777 4931 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cc57676c-79k7x" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerName="dnsmasq-dns" containerID="cri-o://d7ae0ff751a18019fedb3fe9938258f86e6d1759fbc3c988b2c850d94022c34d" gracePeriod=10 Jan 30 06:35:28 crc kubenswrapper[4931]: I0130 06:35:28.553882 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cc57676c-79k7x" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.24:5353: connect: connection refused" Jan 30 06:35:28 crc kubenswrapper[4931]: I0130 06:35:28.832777 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerID="d7ae0ff751a18019fedb3fe9938258f86e6d1759fbc3c988b2c850d94022c34d" exitCode=0 Jan 30 06:35:28 crc kubenswrapper[4931]: I0130 06:35:28.832825 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc57676c-79k7x" event={"ID":"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1","Type":"ContainerDied","Data":"d7ae0ff751a18019fedb3fe9938258f86e6d1759fbc3c988b2c850d94022c34d"} Jan 30 06:35:28 crc kubenswrapper[4931]: I0130 06:35:28.972033 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.060179 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-sb\") pod \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.060278 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-config\") pod \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.060365 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-dns-svc\") pod \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.060398 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-nb\") pod \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.060495 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd7x6\" (UniqueName: \"kubernetes.io/projected/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-kube-api-access-dd7x6\") pod \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\" (UID: \"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1\") " Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.065992 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-kube-api-access-dd7x6" (OuterVolumeSpecName: "kube-api-access-dd7x6") pod "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" (UID: "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1"). InnerVolumeSpecName "kube-api-access-dd7x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.107765 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" (UID: "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.112001 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-config" (OuterVolumeSpecName: "config") pod "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" (UID: "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.113491 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" (UID: "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.118946 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" (UID: "fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.162923 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.162972 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.162996 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd7x6\" (UniqueName: \"kubernetes.io/projected/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-kube-api-access-dd7x6\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.163014 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.163031 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:29 crc kubenswrapper[4931]: E0130 06:35:29.641390 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5868fc_e1b4_4f28_ba8d_6b0d9fad2db1.slice/crio-3961340eb18072ebf266ecd7fc39a9becdb1fb9de589cabfaaf5229e8191f71b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5868fc_e1b4_4f28_ba8d_6b0d9fad2db1.slice\": RecentStats: unable to find data in memory cache]" Jan 30 06:35:29 crc kubenswrapper[4931]: 
I0130 06:35:29.754828 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.800226 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-847c6776d8-4sw8x" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.893366 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc57676c-79k7x" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.893778 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc57676c-79k7x" event={"ID":"fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1","Type":"ContainerDied","Data":"3961340eb18072ebf266ecd7fc39a9becdb1fb9de589cabfaaf5229e8191f71b"} Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.893808 4931 scope.go:117] "RemoveContainer" containerID="d7ae0ff751a18019fedb3fe9938258f86e6d1759fbc3c988b2c850d94022c34d" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.914579 4931 scope.go:117] "RemoveContainer" containerID="376254fe1540e485a58048ed599a0e1e2664491a5fb008a7c2926d2f38ef5153" Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.937993 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc57676c-79k7x"] Jan 30 06:35:29 crc kubenswrapper[4931]: I0130 06:35:29.944776 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cc57676c-79k7x"] Jan 30 06:35:31 crc kubenswrapper[4931]: I0130 06:35:31.434006 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" path="/var/lib/kubelet/pods/fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1/volumes" Jan 30 06:35:38 crc kubenswrapper[4931]: E0130 06:35:38.661643 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.179:42406->38.102.83.179:45103: write tcp 38.102.83.179:42406->38.102.83.179:45103: 
write: connection reset by peer Jan 30 06:35:41 crc kubenswrapper[4931]: I0130 06:35:41.422253 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.022478 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"7acc7dd93c5206c6ff8ebf3271a041083dbd9bcf0e00cc88a42d6c0b4c7429dd"} Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.508482 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-w8ln6"] Jan 30 06:35:42 crc kubenswrapper[4931]: E0130 06:35:42.511299 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerName="dnsmasq-dns" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.511323 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerName="dnsmasq-dns" Jan 30 06:35:42 crc kubenswrapper[4931]: E0130 06:35:42.511346 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerName="init" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.511355 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerName="init" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.511534 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5868fc-e1b4-4f28-ba8d-6b0d9fad2db1" containerName="dnsmasq-dns" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.512040 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.526118 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-w8ln6"] Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.619072 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ee04-account-create-update-5mxt8"] Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.620301 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.622135 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.627345 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ee04-account-create-update-5mxt8"] Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.644307 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259088b5-f22c-4773-a526-5ce0d618a3c9-operator-scripts\") pod \"neutron-db-create-w8ln6\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.644390 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v7c9\" (UniqueName: \"kubernetes.io/projected/259088b5-f22c-4773-a526-5ce0d618a3c9-kube-api-access-6v7c9\") pod \"neutron-db-create-w8ln6\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.746255 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/259088b5-f22c-4773-a526-5ce0d618a3c9-operator-scripts\") pod \"neutron-db-create-w8ln6\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.746383 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v7c9\" (UniqueName: \"kubernetes.io/projected/259088b5-f22c-4773-a526-5ce0d618a3c9-kube-api-access-6v7c9\") pod \"neutron-db-create-w8ln6\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.746449 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnm2q\" (UniqueName: \"kubernetes.io/projected/da3ee3e2-1067-4d91-8780-4ee1442ddccd-kube-api-access-wnm2q\") pod \"neutron-ee04-account-create-update-5mxt8\" (UID: \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.746712 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3ee3e2-1067-4d91-8780-4ee1442ddccd-operator-scripts\") pod \"neutron-ee04-account-create-update-5mxt8\" (UID: \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.747542 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259088b5-f22c-4773-a526-5ce0d618a3c9-operator-scripts\") pod \"neutron-db-create-w8ln6\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.778240 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6v7c9\" (UniqueName: \"kubernetes.io/projected/259088b5-f22c-4773-a526-5ce0d618a3c9-kube-api-access-6v7c9\") pod \"neutron-db-create-w8ln6\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.827986 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.848901 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnm2q\" (UniqueName: \"kubernetes.io/projected/da3ee3e2-1067-4d91-8780-4ee1442ddccd-kube-api-access-wnm2q\") pod \"neutron-ee04-account-create-update-5mxt8\" (UID: \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.849017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3ee3e2-1067-4d91-8780-4ee1442ddccd-operator-scripts\") pod \"neutron-ee04-account-create-update-5mxt8\" (UID: \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.850855 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3ee3e2-1067-4d91-8780-4ee1442ddccd-operator-scripts\") pod \"neutron-ee04-account-create-update-5mxt8\" (UID: \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.882882 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnm2q\" (UniqueName: \"kubernetes.io/projected/da3ee3e2-1067-4d91-8780-4ee1442ddccd-kube-api-access-wnm2q\") pod \"neutron-ee04-account-create-update-5mxt8\" (UID: 
\"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:42 crc kubenswrapper[4931]: I0130 06:35:42.945371 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:43 crc kubenswrapper[4931]: I0130 06:35:43.322392 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ee04-account-create-update-5mxt8"] Jan 30 06:35:43 crc kubenswrapper[4931]: W0130 06:35:43.324099 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda3ee3e2_1067_4d91_8780_4ee1442ddccd.slice/crio-d2fc1bc0b64af85fdcaae625f7a2cd72818f6a6c69c4e57aef2ea89ed03c3e40 WatchSource:0}: Error finding container d2fc1bc0b64af85fdcaae625f7a2cd72818f6a6c69c4e57aef2ea89ed03c3e40: Status 404 returned error can't find the container with id d2fc1bc0b64af85fdcaae625f7a2cd72818f6a6c69c4e57aef2ea89ed03c3e40 Jan 30 06:35:43 crc kubenswrapper[4931]: I0130 06:35:43.389713 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-w8ln6"] Jan 30 06:35:43 crc kubenswrapper[4931]: W0130 06:35:43.398534 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod259088b5_f22c_4773_a526_5ce0d618a3c9.slice/crio-f98acd6750b37318271f0a4ec84a4e883f736cce807edcbf3eac1ea04f125692 WatchSource:0}: Error finding container f98acd6750b37318271f0a4ec84a4e883f736cce807edcbf3eac1ea04f125692: Status 404 returned error can't find the container with id f98acd6750b37318271f0a4ec84a4e883f736cce807edcbf3eac1ea04f125692 Jan 30 06:35:44 crc kubenswrapper[4931]: I0130 06:35:44.046493 4931 generic.go:334] "Generic (PLEG): container finished" podID="259088b5-f22c-4773-a526-5ce0d618a3c9" containerID="8641f9d89c670b316ae569c652c473fa47969340118c8804760552f9529867f0" exitCode=0 Jan 30 06:35:44 crc 
kubenswrapper[4931]: I0130 06:35:44.046592 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8ln6" event={"ID":"259088b5-f22c-4773-a526-5ce0d618a3c9","Type":"ContainerDied","Data":"8641f9d89c670b316ae569c652c473fa47969340118c8804760552f9529867f0"} Jan 30 06:35:44 crc kubenswrapper[4931]: I0130 06:35:44.046928 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8ln6" event={"ID":"259088b5-f22c-4773-a526-5ce0d618a3c9","Type":"ContainerStarted","Data":"f98acd6750b37318271f0a4ec84a4e883f736cce807edcbf3eac1ea04f125692"} Jan 30 06:35:44 crc kubenswrapper[4931]: I0130 06:35:44.051401 4931 generic.go:334] "Generic (PLEG): container finished" podID="da3ee3e2-1067-4d91-8780-4ee1442ddccd" containerID="2a367b7f63781dff8719e328044d9f7bfe39229339b2c9fd8828dc6b757b0a29" exitCode=0 Jan 30 06:35:44 crc kubenswrapper[4931]: I0130 06:35:44.051472 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ee04-account-create-update-5mxt8" event={"ID":"da3ee3e2-1067-4d91-8780-4ee1442ddccd","Type":"ContainerDied","Data":"2a367b7f63781dff8719e328044d9f7bfe39229339b2c9fd8828dc6b757b0a29"} Jan 30 06:35:44 crc kubenswrapper[4931]: I0130 06:35:44.051506 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ee04-account-create-update-5mxt8" event={"ID":"da3ee3e2-1067-4d91-8780-4ee1442ddccd","Type":"ContainerStarted","Data":"d2fc1bc0b64af85fdcaae625f7a2cd72818f6a6c69c4e57aef2ea89ed03c3e40"} Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.542881 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.559065 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.611871 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3ee3e2-1067-4d91-8780-4ee1442ddccd-operator-scripts\") pod \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\" (UID: \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.611957 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnm2q\" (UniqueName: \"kubernetes.io/projected/da3ee3e2-1067-4d91-8780-4ee1442ddccd-kube-api-access-wnm2q\") pod \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\" (UID: \"da3ee3e2-1067-4d91-8780-4ee1442ddccd\") " Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.612012 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v7c9\" (UniqueName: \"kubernetes.io/projected/259088b5-f22c-4773-a526-5ce0d618a3c9-kube-api-access-6v7c9\") pod \"259088b5-f22c-4773-a526-5ce0d618a3c9\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.612140 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259088b5-f22c-4773-a526-5ce0d618a3c9-operator-scripts\") pod \"259088b5-f22c-4773-a526-5ce0d618a3c9\" (UID: \"259088b5-f22c-4773-a526-5ce0d618a3c9\") " Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.613050 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259088b5-f22c-4773-a526-5ce0d618a3c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "259088b5-f22c-4773-a526-5ce0d618a3c9" (UID: "259088b5-f22c-4773-a526-5ce0d618a3c9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.613497 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da3ee3e2-1067-4d91-8780-4ee1442ddccd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da3ee3e2-1067-4d91-8780-4ee1442ddccd" (UID: "da3ee3e2-1067-4d91-8780-4ee1442ddccd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.618135 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3ee3e2-1067-4d91-8780-4ee1442ddccd-kube-api-access-wnm2q" (OuterVolumeSpecName: "kube-api-access-wnm2q") pod "da3ee3e2-1067-4d91-8780-4ee1442ddccd" (UID: "da3ee3e2-1067-4d91-8780-4ee1442ddccd"). InnerVolumeSpecName "kube-api-access-wnm2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.618526 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259088b5-f22c-4773-a526-5ce0d618a3c9-kube-api-access-6v7c9" (OuterVolumeSpecName: "kube-api-access-6v7c9") pod "259088b5-f22c-4773-a526-5ce0d618a3c9" (UID: "259088b5-f22c-4773-a526-5ce0d618a3c9"). InnerVolumeSpecName "kube-api-access-6v7c9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.714844 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3ee3e2-1067-4d91-8780-4ee1442ddccd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.715202 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnm2q\" (UniqueName: \"kubernetes.io/projected/da3ee3e2-1067-4d91-8780-4ee1442ddccd-kube-api-access-wnm2q\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.715224 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v7c9\" (UniqueName: \"kubernetes.io/projected/259088b5-f22c-4773-a526-5ce0d618a3c9-kube-api-access-6v7c9\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:45 crc kubenswrapper[4931]: I0130 06:35:45.715243 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/259088b5-f22c-4773-a526-5ce0d618a3c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:46 crc kubenswrapper[4931]: I0130 06:35:46.076213 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-w8ln6" Jan 30 06:35:46 crc kubenswrapper[4931]: I0130 06:35:46.076712 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8ln6" event={"ID":"259088b5-f22c-4773-a526-5ce0d618a3c9","Type":"ContainerDied","Data":"f98acd6750b37318271f0a4ec84a4e883f736cce807edcbf3eac1ea04f125692"} Jan 30 06:35:46 crc kubenswrapper[4931]: I0130 06:35:46.076778 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f98acd6750b37318271f0a4ec84a4e883f736cce807edcbf3eac1ea04f125692" Jan 30 06:35:46 crc kubenswrapper[4931]: I0130 06:35:46.078899 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ee04-account-create-update-5mxt8" event={"ID":"da3ee3e2-1067-4d91-8780-4ee1442ddccd","Type":"ContainerDied","Data":"d2fc1bc0b64af85fdcaae625f7a2cd72818f6a6c69c4e57aef2ea89ed03c3e40"} Jan 30 06:35:46 crc kubenswrapper[4931]: I0130 06:35:46.078936 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2fc1bc0b64af85fdcaae625f7a2cd72818f6a6c69c4e57aef2ea89ed03c3e40" Jan 30 06:35:46 crc kubenswrapper[4931]: I0130 06:35:46.079053 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ee04-account-create-update-5mxt8" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.820081 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vxl4f"] Jan 30 06:35:47 crc kubenswrapper[4931]: E0130 06:35:47.820779 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259088b5-f22c-4773-a526-5ce0d618a3c9" containerName="mariadb-database-create" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.820795 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="259088b5-f22c-4773-a526-5ce0d618a3c9" containerName="mariadb-database-create" Jan 30 06:35:47 crc kubenswrapper[4931]: E0130 06:35:47.820812 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3ee3e2-1067-4d91-8780-4ee1442ddccd" containerName="mariadb-account-create-update" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.820823 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3ee3e2-1067-4d91-8780-4ee1442ddccd" containerName="mariadb-account-create-update" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.820992 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3ee3e2-1067-4d91-8780-4ee1442ddccd" containerName="mariadb-account-create-update" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.821027 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="259088b5-f22c-4773-a526-5ce0d618a3c9" containerName="mariadb-database-create" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.821773 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.823973 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.824461 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-twcp7" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.824604 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.835210 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vxl4f"] Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.857864 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-config\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.857920 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-combined-ca-bundle\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.858128 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2dqp\" (UniqueName: \"kubernetes.io/projected/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-kube-api-access-p2dqp\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.959390 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-config\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.959544 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-combined-ca-bundle\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.961159 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2dqp\" (UniqueName: \"kubernetes.io/projected/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-kube-api-access-p2dqp\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.967131 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-combined-ca-bundle\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.973222 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-config\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:47 crc kubenswrapper[4931]: I0130 06:35:47.987415 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2dqp\" (UniqueName: 
\"kubernetes.io/projected/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-kube-api-access-p2dqp\") pod \"neutron-db-sync-vxl4f\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:48 crc kubenswrapper[4931]: I0130 06:35:48.157753 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:48 crc kubenswrapper[4931]: I0130 06:35:48.646464 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vxl4f"] Jan 30 06:35:49 crc kubenswrapper[4931]: I0130 06:35:49.117232 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vxl4f" event={"ID":"a2c67196-2e21-4ca1-81c6-ae1d0b68d461","Type":"ContainerStarted","Data":"a418c4ea4e534161dc6a2d3882bfda776aacf77311c0dc409345881787f7574b"} Jan 30 06:35:49 crc kubenswrapper[4931]: I0130 06:35:49.117283 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vxl4f" event={"ID":"a2c67196-2e21-4ca1-81c6-ae1d0b68d461","Type":"ContainerStarted","Data":"3670fe8bc5a251e2f47fae900729e06863bb53e78c9777474cbdbd5c325116f9"} Jan 30 06:35:49 crc kubenswrapper[4931]: I0130 06:35:49.143108 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vxl4f" podStartSLOduration=2.143070398 podStartE2EDuration="2.143070398s" podCreationTimestamp="2026-01-30 06:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:49.140293058 +0000 UTC m=+5284.510203385" watchObservedRunningTime="2026-01-30 06:35:49.143070398 +0000 UTC m=+5284.512980695" Jan 30 06:35:53 crc kubenswrapper[4931]: I0130 06:35:53.158303 4931 generic.go:334] "Generic (PLEG): container finished" podID="a2c67196-2e21-4ca1-81c6-ae1d0b68d461" containerID="a418c4ea4e534161dc6a2d3882bfda776aacf77311c0dc409345881787f7574b" exitCode=0 Jan 30 06:35:53 crc 
kubenswrapper[4931]: I0130 06:35:53.158385 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vxl4f" event={"ID":"a2c67196-2e21-4ca1-81c6-ae1d0b68d461","Type":"ContainerDied","Data":"a418c4ea4e534161dc6a2d3882bfda776aacf77311c0dc409345881787f7574b"} Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.527021 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.595990 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-combined-ca-bundle\") pod \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.596092 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-config\") pod \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.596147 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2dqp\" (UniqueName: \"kubernetes.io/projected/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-kube-api-access-p2dqp\") pod \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\" (UID: \"a2c67196-2e21-4ca1-81c6-ae1d0b68d461\") " Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.602706 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-kube-api-access-p2dqp" (OuterVolumeSpecName: "kube-api-access-p2dqp") pod "a2c67196-2e21-4ca1-81c6-ae1d0b68d461" (UID: "a2c67196-2e21-4ca1-81c6-ae1d0b68d461"). InnerVolumeSpecName "kube-api-access-p2dqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.622338 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2c67196-2e21-4ca1-81c6-ae1d0b68d461" (UID: "a2c67196-2e21-4ca1-81c6-ae1d0b68d461"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.622855 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-config" (OuterVolumeSpecName: "config") pod "a2c67196-2e21-4ca1-81c6-ae1d0b68d461" (UID: "a2c67196-2e21-4ca1-81c6-ae1d0b68d461"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.697440 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.697480 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:54 crc kubenswrapper[4931]: I0130 06:35:54.697494 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2dqp\" (UniqueName: \"kubernetes.io/projected/a2c67196-2e21-4ca1-81c6-ae1d0b68d461-kube-api-access-p2dqp\") on node \"crc\" DevicePath \"\"" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.183531 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vxl4f" 
event={"ID":"a2c67196-2e21-4ca1-81c6-ae1d0b68d461","Type":"ContainerDied","Data":"3670fe8bc5a251e2f47fae900729e06863bb53e78c9777474cbdbd5c325116f9"} Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.184020 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3670fe8bc5a251e2f47fae900729e06863bb53e78c9777474cbdbd5c325116f9" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.183708 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vxl4f" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.484707 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d84887cc5-thvpx"] Jan 30 06:35:55 crc kubenswrapper[4931]: E0130 06:35:55.485082 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c67196-2e21-4ca1-81c6-ae1d0b68d461" containerName="neutron-db-sync" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.485097 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c67196-2e21-4ca1-81c6-ae1d0b68d461" containerName="neutron-db-sync" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.485255 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c67196-2e21-4ca1-81c6-ae1d0b68d461" containerName="neutron-db-sync" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.486089 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.511404 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-config\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.511596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-nb\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.511665 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-dns-svc\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.511712 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv7bs\" (UniqueName: \"kubernetes.io/projected/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-kube-api-access-qv7bs\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.511750 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-sb\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: 
\"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.516487 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d84887cc5-thvpx"] Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.529208 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78978fdd5c-pqg87"] Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.530606 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.533090 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-twcp7" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.533696 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.533835 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.537966 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78978fdd5c-pqg87"] Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.614210 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-dns-svc\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.614365 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv7bs\" (UniqueName: \"kubernetes.io/projected/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-kube-api-access-qv7bs\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " 
pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.615186 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-dns-svc\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.615503 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-config\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.615570 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-sb\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.616201 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-sb\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.616259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-config\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.616826 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-config\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.616890 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-nb\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.617487 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-nb\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.617558 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-combined-ca-bundle\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.617662 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkv6z\" (UniqueName: \"kubernetes.io/projected/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-kube-api-access-gkv6z\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.617710 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-httpd-config\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.632330 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv7bs\" (UniqueName: \"kubernetes.io/projected/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-kube-api-access-qv7bs\") pod \"dnsmasq-dns-d84887cc5-thvpx\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.719518 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-config\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.720011 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-combined-ca-bundle\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.720056 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkv6z\" (UniqueName: \"kubernetes.io/projected/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-kube-api-access-gkv6z\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.720074 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-httpd-config\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.722778 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-config\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.726119 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-httpd-config\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.726220 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-combined-ca-bundle\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.746808 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkv6z\" (UniqueName: \"kubernetes.io/projected/ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5-kube-api-access-gkv6z\") pod \"neutron-78978fdd5c-pqg87\" (UID: \"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5\") " pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.824733 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:35:55 crc kubenswrapper[4931]: I0130 06:35:55.847325 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:56 crc kubenswrapper[4931]: I0130 06:35:56.285313 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d84887cc5-thvpx"] Jan 30 06:35:56 crc kubenswrapper[4931]: I0130 06:35:56.477323 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78978fdd5c-pqg87"] Jan 30 06:35:56 crc kubenswrapper[4931]: W0130 06:35:56.482510 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef26e9fb_2e6d_4582_a140_1ebd8eebc9e5.slice/crio-4f58ba9c78a9aa618bd3163dd6eea495c1e01667778064d935f8061a0cbc231c WatchSource:0}: Error finding container 4f58ba9c78a9aa618bd3163dd6eea495c1e01667778064d935f8061a0cbc231c: Status 404 returned error can't find the container with id 4f58ba9c78a9aa618bd3163dd6eea495c1e01667778064d935f8061a0cbc231c Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.201175 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78978fdd5c-pqg87" event={"ID":"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5","Type":"ContainerStarted","Data":"5d3f791e0fbe47127d8892cc054af5a0452bbcb2d8edd1baaab73a68e999b293"} Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.201936 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78978fdd5c-pqg87" event={"ID":"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5","Type":"ContainerStarted","Data":"0051ae7f7b4dee62dc46439aad8d97f329741b7b6d43d866d13ee09e484ea0b2"} Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.201980 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78978fdd5c-pqg87" event={"ID":"ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5","Type":"ContainerStarted","Data":"4f58ba9c78a9aa618bd3163dd6eea495c1e01667778064d935f8061a0cbc231c"} Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.201999 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.217993 4931 generic.go:334] "Generic (PLEG): container finished" podID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerID="753bd88c07189bffc8d69c9bf6a34e6d86af3f56c735e6c62d25cd5e5e4da562" exitCode=0 Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.218220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" event={"ID":"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5","Type":"ContainerDied","Data":"753bd88c07189bffc8d69c9bf6a34e6d86af3f56c735e6c62d25cd5e5e4da562"} Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.218300 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" event={"ID":"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5","Type":"ContainerStarted","Data":"48bcc562af333fbf9303da09ed90c070b458fec2e34b83616d91ef56cccabb57"} Jan 30 06:35:57 crc kubenswrapper[4931]: I0130 06:35:57.248635 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78978fdd5c-pqg87" podStartSLOduration=2.248608211 podStartE2EDuration="2.248608211s" podCreationTimestamp="2026-01-30 06:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:57.225680583 +0000 UTC m=+5292.595590850" watchObservedRunningTime="2026-01-30 06:35:57.248608211 +0000 UTC m=+5292.618518468" Jan 30 06:35:58 crc kubenswrapper[4931]: I0130 06:35:58.238834 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" event={"ID":"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5","Type":"ContainerStarted","Data":"9a53ddc494ac40f2cd235561690db17e6fa36bf1d84c504bb1475d76a3081caf"} Jan 30 06:35:58 crc kubenswrapper[4931]: I0130 06:35:58.270918 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" 
podStartSLOduration=3.270900391 podStartE2EDuration="3.270900391s" podCreationTimestamp="2026-01-30 06:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:58.270223101 +0000 UTC m=+5293.640133388" watchObservedRunningTime="2026-01-30 06:35:58.270900391 +0000 UTC m=+5293.640810648" Jan 30 06:35:59 crc kubenswrapper[4931]: I0130 06:35:59.249203 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:36:05 crc kubenswrapper[4931]: I0130 06:36:05.825585 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:36:05 crc kubenswrapper[4931]: I0130 06:36:05.890571 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-669bb76d6c-ld8p5"] Jan 30 06:36:05 crc kubenswrapper[4931]: I0130 06:36:05.890813 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" podUID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" containerName="dnsmasq-dns" containerID="cri-o://ae5834eb7aa94f3cb815f35e315bb6ed049e191afc33534b32776fd0debd1c4a" gracePeriod=10 Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.347278 4931 generic.go:334] "Generic (PLEG): container finished" podID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" containerID="ae5834eb7aa94f3cb815f35e315bb6ed049e191afc33534b32776fd0debd1c4a" exitCode=0 Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.347362 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" event={"ID":"66eeada2-dfe0-4ebc-af62-17af9f1ce15e","Type":"ContainerDied","Data":"ae5834eb7aa94f3cb815f35e315bb6ed049e191afc33534b32776fd0debd1c4a"} Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.406257 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.519141 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-dns-svc\") pod \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.519206 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-nb\") pod \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.519235 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-sb\") pod \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.519329 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-config\") pod \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.519354 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t4f9\" (UniqueName: \"kubernetes.io/projected/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-kube-api-access-4t4f9\") pod \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\" (UID: \"66eeada2-dfe0-4ebc-af62-17af9f1ce15e\") " Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.526671 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-kube-api-access-4t4f9" (OuterVolumeSpecName: "kube-api-access-4t4f9") pod "66eeada2-dfe0-4ebc-af62-17af9f1ce15e" (UID: "66eeada2-dfe0-4ebc-af62-17af9f1ce15e"). InnerVolumeSpecName "kube-api-access-4t4f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.577787 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66eeada2-dfe0-4ebc-af62-17af9f1ce15e" (UID: "66eeada2-dfe0-4ebc-af62-17af9f1ce15e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.578127 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66eeada2-dfe0-4ebc-af62-17af9f1ce15e" (UID: "66eeada2-dfe0-4ebc-af62-17af9f1ce15e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.578807 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-config" (OuterVolumeSpecName: "config") pod "66eeada2-dfe0-4ebc-af62-17af9f1ce15e" (UID: "66eeada2-dfe0-4ebc-af62-17af9f1ce15e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.578997 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66eeada2-dfe0-4ebc-af62-17af9f1ce15e" (UID: "66eeada2-dfe0-4ebc-af62-17af9f1ce15e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.621019 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.621052 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t4f9\" (UniqueName: \"kubernetes.io/projected/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-kube-api-access-4t4f9\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.621065 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.621075 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:06 crc kubenswrapper[4931]: I0130 06:36:06.621085 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66eeada2-dfe0-4ebc-af62-17af9f1ce15e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:07 crc kubenswrapper[4931]: I0130 06:36:07.357534 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" event={"ID":"66eeada2-dfe0-4ebc-af62-17af9f1ce15e","Type":"ContainerDied","Data":"509c57c8ee870d1521b8c78081fe670bec4a3e4b66684ebdaba1d4543435f196"} Jan 30 06:36:07 crc kubenswrapper[4931]: I0130 06:36:07.357627 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-669bb76d6c-ld8p5" Jan 30 06:36:07 crc kubenswrapper[4931]: I0130 06:36:07.358079 4931 scope.go:117] "RemoveContainer" containerID="ae5834eb7aa94f3cb815f35e315bb6ed049e191afc33534b32776fd0debd1c4a" Jan 30 06:36:07 crc kubenswrapper[4931]: I0130 06:36:07.379883 4931 scope.go:117] "RemoveContainer" containerID="fac0f8070b365b5149eb56040af36ca71ef68ff65a5f0aef7d169c86e39479df" Jan 30 06:36:07 crc kubenswrapper[4931]: I0130 06:36:07.403222 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-669bb76d6c-ld8p5"] Jan 30 06:36:07 crc kubenswrapper[4931]: I0130 06:36:07.409160 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-669bb76d6c-ld8p5"] Jan 30 06:36:07 crc kubenswrapper[4931]: I0130 06:36:07.433134 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" path="/var/lib/kubelet/pods/66eeada2-dfe0-4ebc-af62-17af9f1ce15e/volumes" Jan 30 06:36:25 crc kubenswrapper[4931]: I0130 06:36:25.863005 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78978fdd5c-pqg87" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.121902 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-fcgh6"] Jan 30 06:36:33 crc kubenswrapper[4931]: E0130 06:36:33.122868 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" containerName="init" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.122888 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" containerName="init" Jan 30 06:36:33 crc kubenswrapper[4931]: E0130 06:36:33.122909 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" containerName="dnsmasq-dns" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.122916 4931 
state_mem.go:107] "Deleted CPUSet assignment" podUID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" containerName="dnsmasq-dns" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.123108 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="66eeada2-dfe0-4ebc-af62-17af9f1ce15e" containerName="dnsmasq-dns" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.123741 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fcgh6" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.130623 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fcgh6"] Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.227999 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8be9-account-create-update-4qptt"] Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.229339 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8be9-account-create-update-4qptt" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.231044 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.245332 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8be9-account-create-update-4qptt"] Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.247337 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5a82c2-728c-40a6-83b0-37ba70d84931-operator-scripts\") pod \"glance-db-create-fcgh6\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") " pod="openstack/glance-db-create-fcgh6" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.247552 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjr9\" (UniqueName: 
\"kubernetes.io/projected/fe5a82c2-728c-40a6-83b0-37ba70d84931-kube-api-access-wrjr9\") pod \"glance-db-create-fcgh6\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") " pod="openstack/glance-db-create-fcgh6" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.349495 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76ttz\" (UniqueName: \"kubernetes.io/projected/f6cc38ea-1412-4e17-9c74-779b7c6d701c-kube-api-access-76ttz\") pod \"glance-8be9-account-create-update-4qptt\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") " pod="openstack/glance-8be9-account-create-update-4qptt" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.349567 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5a82c2-728c-40a6-83b0-37ba70d84931-operator-scripts\") pod \"glance-db-create-fcgh6\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") " pod="openstack/glance-db-create-fcgh6" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.349732 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cc38ea-1412-4e17-9c74-779b7c6d701c-operator-scripts\") pod \"glance-8be9-account-create-update-4qptt\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") " pod="openstack/glance-8be9-account-create-update-4qptt" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.349882 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjr9\" (UniqueName: \"kubernetes.io/projected/fe5a82c2-728c-40a6-83b0-37ba70d84931-kube-api-access-wrjr9\") pod \"glance-db-create-fcgh6\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") " pod="openstack/glance-db-create-fcgh6" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.350702 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5a82c2-728c-40a6-83b0-37ba70d84931-operator-scripts\") pod \"glance-db-create-fcgh6\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") " pod="openstack/glance-db-create-fcgh6" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.373297 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjr9\" (UniqueName: \"kubernetes.io/projected/fe5a82c2-728c-40a6-83b0-37ba70d84931-kube-api-access-wrjr9\") pod \"glance-db-create-fcgh6\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") " pod="openstack/glance-db-create-fcgh6" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.451684 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76ttz\" (UniqueName: \"kubernetes.io/projected/f6cc38ea-1412-4e17-9c74-779b7c6d701c-kube-api-access-76ttz\") pod \"glance-8be9-account-create-update-4qptt\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") " pod="openstack/glance-8be9-account-create-update-4qptt" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.451772 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cc38ea-1412-4e17-9c74-779b7c6d701c-operator-scripts\") pod \"glance-8be9-account-create-update-4qptt\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") " pod="openstack/glance-8be9-account-create-update-4qptt" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.452652 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cc38ea-1412-4e17-9c74-779b7c6d701c-operator-scripts\") pod \"glance-8be9-account-create-update-4qptt\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") " pod="openstack/glance-8be9-account-create-update-4qptt" Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.476791 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-76ttz\" (UniqueName: \"kubernetes.io/projected/f6cc38ea-1412-4e17-9c74-779b7c6d701c-kube-api-access-76ttz\") pod \"glance-8be9-account-create-update-4qptt\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") " pod="openstack/glance-8be9-account-create-update-4qptt"
Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.490708 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fcgh6"
Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.543662 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8be9-account-create-update-4qptt"
Jan 30 06:36:33 crc kubenswrapper[4931]: I0130 06:36:33.979658 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fcgh6"]
Jan 30 06:36:34 crc kubenswrapper[4931]: I0130 06:36:34.046598 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8be9-account-create-update-4qptt"]
Jan 30 06:36:34 crc kubenswrapper[4931]: W0130 06:36:34.046846 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6cc38ea_1412_4e17_9c74_779b7c6d701c.slice/crio-8775712b1867d046021ff197ce3cb48cf5322c9e656c2529b4c751dd69d648f5 WatchSource:0}: Error finding container 8775712b1867d046021ff197ce3cb48cf5322c9e656c2529b4c751dd69d648f5: Status 404 returned error can't find the container with id 8775712b1867d046021ff197ce3cb48cf5322c9e656c2529b4c751dd69d648f5
Jan 30 06:36:34 crc kubenswrapper[4931]: I0130 06:36:34.623823 4931 generic.go:334] "Generic (PLEG): container finished" podID="f6cc38ea-1412-4e17-9c74-779b7c6d701c" containerID="b12392121e0278ef6aaee0ef2cb91f20ce791df236403c3611d10649bcb909d3" exitCode=0
Jan 30 06:36:34 crc kubenswrapper[4931]: I0130 06:36:34.623968 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8be9-account-create-update-4qptt" event={"ID":"f6cc38ea-1412-4e17-9c74-779b7c6d701c","Type":"ContainerDied","Data":"b12392121e0278ef6aaee0ef2cb91f20ce791df236403c3611d10649bcb909d3"}
Jan 30 06:36:34 crc kubenswrapper[4931]: I0130 06:36:34.624465 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8be9-account-create-update-4qptt" event={"ID":"f6cc38ea-1412-4e17-9c74-779b7c6d701c","Type":"ContainerStarted","Data":"8775712b1867d046021ff197ce3cb48cf5322c9e656c2529b4c751dd69d648f5"}
Jan 30 06:36:34 crc kubenswrapper[4931]: I0130 06:36:34.627166 4931 generic.go:334] "Generic (PLEG): container finished" podID="fe5a82c2-728c-40a6-83b0-37ba70d84931" containerID="4711e717af206225417ee23e6a5a6867fd0fca04b0c1bb798437c5d765e9f38b" exitCode=0
Jan 30 06:36:34 crc kubenswrapper[4931]: I0130 06:36:34.627228 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fcgh6" event={"ID":"fe5a82c2-728c-40a6-83b0-37ba70d84931","Type":"ContainerDied","Data":"4711e717af206225417ee23e6a5a6867fd0fca04b0c1bb798437c5d765e9f38b"}
Jan 30 06:36:34 crc kubenswrapper[4931]: I0130 06:36:34.627260 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fcgh6" event={"ID":"fe5a82c2-728c-40a6-83b0-37ba70d84931","Type":"ContainerStarted","Data":"4b0033d6fb69375936e0004de30bd4a3ca48c1df256811c61ab55f5388f86d4c"}
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.115310 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8be9-account-create-update-4qptt"
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.126906 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fcgh6"
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.205150 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76ttz\" (UniqueName: \"kubernetes.io/projected/f6cc38ea-1412-4e17-9c74-779b7c6d701c-kube-api-access-76ttz\") pod \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") "
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.205267 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrjr9\" (UniqueName: \"kubernetes.io/projected/fe5a82c2-728c-40a6-83b0-37ba70d84931-kube-api-access-wrjr9\") pod \"fe5a82c2-728c-40a6-83b0-37ba70d84931\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") "
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.205324 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cc38ea-1412-4e17-9c74-779b7c6d701c-operator-scripts\") pod \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\" (UID: \"f6cc38ea-1412-4e17-9c74-779b7c6d701c\") "
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.205451 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5a82c2-728c-40a6-83b0-37ba70d84931-operator-scripts\") pod \"fe5a82c2-728c-40a6-83b0-37ba70d84931\" (UID: \"fe5a82c2-728c-40a6-83b0-37ba70d84931\") "
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.206362 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5a82c2-728c-40a6-83b0-37ba70d84931-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe5a82c2-728c-40a6-83b0-37ba70d84931" (UID: "fe5a82c2-728c-40a6-83b0-37ba70d84931"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.206915 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6cc38ea-1412-4e17-9c74-779b7c6d701c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6cc38ea-1412-4e17-9c74-779b7c6d701c" (UID: "f6cc38ea-1412-4e17-9c74-779b7c6d701c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.209979 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5a82c2-728c-40a6-83b0-37ba70d84931-kube-api-access-wrjr9" (OuterVolumeSpecName: "kube-api-access-wrjr9") pod "fe5a82c2-728c-40a6-83b0-37ba70d84931" (UID: "fe5a82c2-728c-40a6-83b0-37ba70d84931"). InnerVolumeSpecName "kube-api-access-wrjr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.210518 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cc38ea-1412-4e17-9c74-779b7c6d701c-kube-api-access-76ttz" (OuterVolumeSpecName: "kube-api-access-76ttz") pod "f6cc38ea-1412-4e17-9c74-779b7c6d701c" (UID: "f6cc38ea-1412-4e17-9c74-779b7c6d701c"). InnerVolumeSpecName "kube-api-access-76ttz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.307039 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrjr9\" (UniqueName: \"kubernetes.io/projected/fe5a82c2-728c-40a6-83b0-37ba70d84931-kube-api-access-wrjr9\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.307090 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6cc38ea-1412-4e17-9c74-779b7c6d701c-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.307110 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5a82c2-728c-40a6-83b0-37ba70d84931-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.307129 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76ttz\" (UniqueName: \"kubernetes.io/projected/f6cc38ea-1412-4e17-9c74-779b7c6d701c-kube-api-access-76ttz\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.652068 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8be9-account-create-update-4qptt" event={"ID":"f6cc38ea-1412-4e17-9c74-779b7c6d701c","Type":"ContainerDied","Data":"8775712b1867d046021ff197ce3cb48cf5322c9e656c2529b4c751dd69d648f5"}
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.652485 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8775712b1867d046021ff197ce3cb48cf5322c9e656c2529b4c751dd69d648f5"
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.652197 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8be9-account-create-update-4qptt"
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.654755 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fcgh6" event={"ID":"fe5a82c2-728c-40a6-83b0-37ba70d84931","Type":"ContainerDied","Data":"4b0033d6fb69375936e0004de30bd4a3ca48c1df256811c61ab55f5388f86d4c"}
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.654825 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b0033d6fb69375936e0004de30bd4a3ca48c1df256811c61ab55f5388f86d4c"
Jan 30 06:36:36 crc kubenswrapper[4931]: I0130 06:36:36.654844 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fcgh6"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.299181 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-529p5"]
Jan 30 06:36:38 crc kubenswrapper[4931]: E0130 06:36:38.299641 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5a82c2-728c-40a6-83b0-37ba70d84931" containerName="mariadb-database-create"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.299656 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5a82c2-728c-40a6-83b0-37ba70d84931" containerName="mariadb-database-create"
Jan 30 06:36:38 crc kubenswrapper[4931]: E0130 06:36:38.299680 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cc38ea-1412-4e17-9c74-779b7c6d701c" containerName="mariadb-account-create-update"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.299688 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cc38ea-1412-4e17-9c74-779b7c6d701c" containerName="mariadb-account-create-update"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.299905 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6cc38ea-1412-4e17-9c74-779b7c6d701c" containerName="mariadb-account-create-update"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.299933 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5a82c2-728c-40a6-83b0-37ba70d84931" containerName="mariadb-database-create"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.300599 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.307772 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8bfcx"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.307796 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.310155 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-529p5"]
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.446006 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfvk6\" (UniqueName: \"kubernetes.io/projected/bff91271-f1e2-4aaf-adec-bc61ce9dedad-kube-api-access-jfvk6\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.446098 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-config-data\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.446166 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-combined-ca-bundle\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.446245 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-db-sync-config-data\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.548090 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfvk6\" (UniqueName: \"kubernetes.io/projected/bff91271-f1e2-4aaf-adec-bc61ce9dedad-kube-api-access-jfvk6\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.548501 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-config-data\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.548547 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-combined-ca-bundle\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.548569 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-db-sync-config-data\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.570178 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-db-sync-config-data\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.570190 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-combined-ca-bundle\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.570193 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-config-data\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.581049 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfvk6\" (UniqueName: \"kubernetes.io/projected/bff91271-f1e2-4aaf-adec-bc61ce9dedad-kube-api-access-jfvk6\") pod \"glance-db-sync-529p5\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") " pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.620979 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:38 crc kubenswrapper[4931]: I0130 06:36:38.941291 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-529p5"]
Jan 30 06:36:39 crc kubenswrapper[4931]: I0130 06:36:39.691076 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-529p5" event={"ID":"bff91271-f1e2-4aaf-adec-bc61ce9dedad","Type":"ContainerStarted","Data":"6f79520471e9a429df8d71872cafba7c48f1385750af4061f0e5ea5c4355f53e"}
Jan 30 06:36:39 crc kubenswrapper[4931]: I0130 06:36:39.691401 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-529p5" event={"ID":"bff91271-f1e2-4aaf-adec-bc61ce9dedad","Type":"ContainerStarted","Data":"2b95e40187ff1d07e7b9f697d565d12c73d63ef422465c2985d21a0747596235"}
Jan 30 06:36:39 crc kubenswrapper[4931]: I0130 06:36:39.712578 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-529p5" podStartSLOduration=1.712560282 podStartE2EDuration="1.712560282s" podCreationTimestamp="2026-01-30 06:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:39.706920354 +0000 UTC m=+5335.076830611" watchObservedRunningTime="2026-01-30 06:36:39.712560282 +0000 UTC m=+5335.082470539"
Jan 30 06:36:42 crc kubenswrapper[4931]: I0130 06:36:42.724029 4931 generic.go:334] "Generic (PLEG): container finished" podID="bff91271-f1e2-4aaf-adec-bc61ce9dedad" containerID="6f79520471e9a429df8d71872cafba7c48f1385750af4061f0e5ea5c4355f53e" exitCode=0
Jan 30 06:36:42 crc kubenswrapper[4931]: I0130 06:36:42.724093 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-529p5" event={"ID":"bff91271-f1e2-4aaf-adec-bc61ce9dedad","Type":"ContainerDied","Data":"6f79520471e9a429df8d71872cafba7c48f1385750af4061f0e5ea5c4355f53e"}
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.230044 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.266204 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-combined-ca-bundle\") pod \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") "
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.266317 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-db-sync-config-data\") pod \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") "
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.266378 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-config-data\") pod \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") "
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.266463 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfvk6\" (UniqueName: \"kubernetes.io/projected/bff91271-f1e2-4aaf-adec-bc61ce9dedad-kube-api-access-jfvk6\") pod \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\" (UID: \"bff91271-f1e2-4aaf-adec-bc61ce9dedad\") "
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.273280 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff91271-f1e2-4aaf-adec-bc61ce9dedad-kube-api-access-jfvk6" (OuterVolumeSpecName: "kube-api-access-jfvk6") pod "bff91271-f1e2-4aaf-adec-bc61ce9dedad" (UID: "bff91271-f1e2-4aaf-adec-bc61ce9dedad"). InnerVolumeSpecName "kube-api-access-jfvk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.274047 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bff91271-f1e2-4aaf-adec-bc61ce9dedad" (UID: "bff91271-f1e2-4aaf-adec-bc61ce9dedad"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.312833 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bff91271-f1e2-4aaf-adec-bc61ce9dedad" (UID: "bff91271-f1e2-4aaf-adec-bc61ce9dedad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.319021 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-config-data" (OuterVolumeSpecName: "config-data") pod "bff91271-f1e2-4aaf-adec-bc61ce9dedad" (UID: "bff91271-f1e2-4aaf-adec-bc61ce9dedad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.367988 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.368029 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.368042 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfvk6\" (UniqueName: \"kubernetes.io/projected/bff91271-f1e2-4aaf-adec-bc61ce9dedad-kube-api-access-jfvk6\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.368055 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff91271-f1e2-4aaf-adec-bc61ce9dedad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.747450 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-529p5" event={"ID":"bff91271-f1e2-4aaf-adec-bc61ce9dedad","Type":"ContainerDied","Data":"2b95e40187ff1d07e7b9f697d565d12c73d63ef422465c2985d21a0747596235"}
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.747737 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b95e40187ff1d07e7b9f697d565d12c73d63ef422465c2985d21a0747596235"
Jan 30 06:36:44 crc kubenswrapper[4931]: I0130 06:36:44.747600 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-529p5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.068843 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:36:45 crc kubenswrapper[4931]: E0130 06:36:45.069454 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff91271-f1e2-4aaf-adec-bc61ce9dedad" containerName="glance-db-sync"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.069578 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff91271-f1e2-4aaf-adec-bc61ce9dedad" containerName="glance-db-sync"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.069878 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff91271-f1e2-4aaf-adec-bc61ce9dedad" containerName="glance-db-sync"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.071047 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.073994 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.074087 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.074643 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.074990 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8bfcx"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.090332 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.191260 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.191661 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ls9v\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-kube-api-access-8ls9v\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.191717 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.191768 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.191814 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.191852 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-ceph\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.191884 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-logs\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.227651 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cf7fddbc7-982b5"]
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.228977 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.247212 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf7fddbc7-982b5"]
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.293257 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ls9v\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-kube-api-access-8ls9v\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.293315 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.293361 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294080 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294134 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294159 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-config\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294177 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-ceph\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294206 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6k6k\" (UniqueName: \"kubernetes.io/projected/e7f7543a-72f1-4937-95d9-8869b77ab81d-kube-api-access-s6k6k\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294223 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-logs\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294260 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294282 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-dns-svc\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294854 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.294864 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-logs\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.298945 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.311010 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.313160 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.313530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-ceph\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.317661 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ls9v\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-kube-api-access-8ls9v\") pod \"glance-default-external-api-0\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.319090 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.321052 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.326136 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.341864 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396046 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-dns-svc\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396302 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396446 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396533 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjfql\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-kube-api-access-mjfql\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396628 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396701 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396781 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396855 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.396925 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.397004 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-config\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.397070 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-dns-svc\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5"
Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.397074 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") "
pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.397081 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6k6k\" (UniqueName: \"kubernetes.io/projected/e7f7543a-72f1-4937-95d9-8869b77ab81d-kube-api-access-s6k6k\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.397191 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.397606 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.397859 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-config\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.399571 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.413382 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6k6k\" (UniqueName: \"kubernetes.io/projected/e7f7543a-72f1-4937-95d9-8869b77ab81d-kube-api-access-s6k6k\") pod \"dnsmasq-dns-7cf7fddbc7-982b5\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.498315 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.498615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.499114 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.499168 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 
30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.499187 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.499220 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.499280 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.499358 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjfql\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-kube-api-access-mjfql\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.499559 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.506063 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.518109 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.521008 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.525569 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjfql\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-kube-api-access-mjfql\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.525936 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.560094 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:45 crc kubenswrapper[4931]: I0130 06:36:45.577051 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.101633 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.132741 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:36:46 crc kubenswrapper[4931]: W0130 06:36:46.153102 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ea0350_921c_4016_861b_f61da343aaa6.slice/crio-4a2868e0aa0bfd6a5e0709070490840695d3b5f971138818dd14c1cb2a50ec7b WatchSource:0}: Error finding container 4a2868e0aa0bfd6a5e0709070490840695d3b5f971138818dd14c1cb2a50ec7b: Status 404 returned error can't find the container with id 4a2868e0aa0bfd6a5e0709070490840695d3b5f971138818dd14c1cb2a50ec7b Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.184803 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf7fddbc7-982b5"] Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.237102 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:36:46 crc kubenswrapper[4931]: W0130 06:36:46.244322 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae4a7a6_c1f4_4fd4_a844_f35eef27ffbc.slice/crio-68279e3601f3bb1874e45c1fa4e255a1034408fe199037496d6fdf3b86235715 WatchSource:0}: Error finding container 68279e3601f3bb1874e45c1fa4e255a1034408fe199037496d6fdf3b86235715: Status 404 returned error can't find the container with id 68279e3601f3bb1874e45c1fa4e255a1034408fe199037496d6fdf3b86235715 Jan 
30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.772390 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc","Type":"ContainerStarted","Data":"68279e3601f3bb1874e45c1fa4e255a1034408fe199037496d6fdf3b86235715"} Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.773613 4931 generic.go:334] "Generic (PLEG): container finished" podID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerID="7119d5284982674648c0826e4626a711e51ca2133b6b1310bb2a6ca06e64c6b3" exitCode=0 Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.773652 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" event={"ID":"e7f7543a-72f1-4937-95d9-8869b77ab81d","Type":"ContainerDied","Data":"7119d5284982674648c0826e4626a711e51ca2133b6b1310bb2a6ca06e64c6b3"} Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.773668 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" event={"ID":"e7f7543a-72f1-4937-95d9-8869b77ab81d","Type":"ContainerStarted","Data":"424c4112ef530e297d0cff0d4af771a05512506e55897ed954c10a6043fe7171"} Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.778127 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ea0350-921c-4016-861b-f61da343aaa6","Type":"ContainerStarted","Data":"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac"} Jan 30 06:36:46 crc kubenswrapper[4931]: I0130 06:36:46.778160 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ea0350-921c-4016-861b-f61da343aaa6","Type":"ContainerStarted","Data":"4a2868e0aa0bfd6a5e0709070490840695d3b5f971138818dd14c1cb2a50ec7b"} Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.804996 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" 
event={"ID":"e7f7543a-72f1-4937-95d9-8869b77ab81d","Type":"ContainerStarted","Data":"64b8f4307220d8bf205a24561416d6dc5c9e47a88b8a78a79bc73aae4b6035c4"} Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.805453 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.808969 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ea0350-921c-4016-861b-f61da343aaa6","Type":"ContainerStarted","Data":"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce"} Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.809074 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-log" containerID="cri-o://a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac" gracePeriod=30 Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.809251 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-httpd" containerID="cri-o://b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce" gracePeriod=30 Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.811701 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc","Type":"ContainerStarted","Data":"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a"} Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.811722 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc","Type":"ContainerStarted","Data":"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac"} 
Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.828134 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" podStartSLOduration=2.8281182620000003 podStartE2EDuration="2.828118262s" podCreationTimestamp="2026-01-30 06:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:47.822708611 +0000 UTC m=+5343.192618908" watchObservedRunningTime="2026-01-30 06:36:47.828118262 +0000 UTC m=+5343.198028519" Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.849157 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.84913923 podStartE2EDuration="2.84913923s" podCreationTimestamp="2026-01-30 06:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:47.845465018 +0000 UTC m=+5343.215375275" watchObservedRunningTime="2026-01-30 06:36:47.84913923 +0000 UTC m=+5343.219049487" Jan 30 06:36:47 crc kubenswrapper[4931]: I0130 06:36:47.876069 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.876054773 podStartE2EDuration="2.876054773s" podCreationTimestamp="2026-01-30 06:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:47.873324167 +0000 UTC m=+5343.243234424" watchObservedRunningTime="2026-01-30 06:36:47.876054773 +0000 UTC m=+5343.245965030" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.197980 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.428292 4931 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.553883 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-combined-ca-bundle\") pod \"d5ea0350-921c-4016-861b-f61da343aaa6\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.553957 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-httpd-run\") pod \"d5ea0350-921c-4016-861b-f61da343aaa6\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.554067 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-config-data\") pod \"d5ea0350-921c-4016-861b-f61da343aaa6\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.554115 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-scripts\") pod \"d5ea0350-921c-4016-861b-f61da343aaa6\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.554134 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ls9v\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-kube-api-access-8ls9v\") pod \"d5ea0350-921c-4016-861b-f61da343aaa6\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.554154 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-ceph\") pod \"d5ea0350-921c-4016-861b-f61da343aaa6\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.554205 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-logs\") pod \"d5ea0350-921c-4016-861b-f61da343aaa6\" (UID: \"d5ea0350-921c-4016-861b-f61da343aaa6\") " Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.554960 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-logs" (OuterVolumeSpecName: "logs") pod "d5ea0350-921c-4016-861b-f61da343aaa6" (UID: "d5ea0350-921c-4016-861b-f61da343aaa6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.557616 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5ea0350-921c-4016-861b-f61da343aaa6" (UID: "d5ea0350-921c-4016-861b-f61da343aaa6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.560235 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-kube-api-access-8ls9v" (OuterVolumeSpecName: "kube-api-access-8ls9v") pod "d5ea0350-921c-4016-861b-f61da343aaa6" (UID: "d5ea0350-921c-4016-861b-f61da343aaa6"). InnerVolumeSpecName "kube-api-access-8ls9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.561700 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-scripts" (OuterVolumeSpecName: "scripts") pod "d5ea0350-921c-4016-861b-f61da343aaa6" (UID: "d5ea0350-921c-4016-861b-f61da343aaa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.562654 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-ceph" (OuterVolumeSpecName: "ceph") pod "d5ea0350-921c-4016-861b-f61da343aaa6" (UID: "d5ea0350-921c-4016-861b-f61da343aaa6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.595288 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5ea0350-921c-4016-861b-f61da343aaa6" (UID: "d5ea0350-921c-4016-861b-f61da343aaa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.622587 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-config-data" (OuterVolumeSpecName: "config-data") pod "d5ea0350-921c-4016-861b-f61da343aaa6" (UID: "d5ea0350-921c-4016-861b-f61da343aaa6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.656085 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.656114 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.656125 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ls9v\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-kube-api-access-8ls9v\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.656134 4931 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5ea0350-921c-4016-861b-f61da343aaa6-ceph\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.656142 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.656149 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ea0350-921c-4016-861b-f61da343aaa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.656157 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ea0350-921c-4016-861b-f61da343aaa6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.823071 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="d5ea0350-921c-4016-861b-f61da343aaa6" containerID="b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce" exitCode=0 Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.823107 4931 generic.go:334] "Generic (PLEG): container finished" podID="d5ea0350-921c-4016-861b-f61da343aaa6" containerID="a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac" exitCode=143 Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.824023 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.825551 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ea0350-921c-4016-861b-f61da343aaa6","Type":"ContainerDied","Data":"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce"} Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.825593 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ea0350-921c-4016-861b-f61da343aaa6","Type":"ContainerDied","Data":"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac"} Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.825612 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ea0350-921c-4016-861b-f61da343aaa6","Type":"ContainerDied","Data":"4a2868e0aa0bfd6a5e0709070490840695d3b5f971138818dd14c1cb2a50ec7b"} Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.825631 4931 scope.go:117] "RemoveContainer" containerID="b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.861991 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.865872 4931 scope.go:117] "RemoveContainer" 
containerID="a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.870823 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.890286 4931 scope.go:117] "RemoveContainer" containerID="b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce" Jan 30 06:36:48 crc kubenswrapper[4931]: E0130 06:36:48.890921 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce\": container with ID starting with b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce not found: ID does not exist" containerID="b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.890980 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce"} err="failed to get container status \"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce\": rpc error: code = NotFound desc = could not find container \"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce\": container with ID starting with b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce not found: ID does not exist" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.891007 4931 scope.go:117] "RemoveContainer" containerID="a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.891143 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:36:48 crc kubenswrapper[4931]: E0130 06:36:48.891528 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac\": container with ID starting with a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac not found: ID does not exist" containerID="a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.891569 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac"} err="failed to get container status \"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac\": rpc error: code = NotFound desc = could not find container \"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac\": container with ID starting with a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac not found: ID does not exist" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.891608 4931 scope.go:117] "RemoveContainer" containerID="b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce" Jan 30 06:36:48 crc kubenswrapper[4931]: E0130 06:36:48.891547 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-log" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.891662 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-log" Jan 30 06:36:48 crc kubenswrapper[4931]: E0130 06:36:48.891714 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-httpd" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.891722 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-httpd" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.892262 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-log" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.892307 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" containerName="glance-httpd" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.892929 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce"} err="failed to get container status \"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce\": rpc error: code = NotFound desc = could not find container \"b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce\": container with ID starting with b115f2b1a1dc0c68c9d754fe7e52de76fd5115ea1dd25ac48c3263f542fd97ce not found: ID does not exist" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.892970 4931 scope.go:117] "RemoveContainer" containerID="a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.893595 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.894527 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac"} err="failed to get container status \"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac\": rpc error: code = NotFound desc = could not find container \"a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac\": container with ID starting with a9ae1b83e11509d273525802f75cf08999ac0b5f8cf14f23e66ab7bd4a18dcac not found: ID does not exist" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.896639 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 06:36:48 crc kubenswrapper[4931]: I0130 06:36:48.909018 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.062857 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.063003 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.063145 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.063214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.063272 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8j2s\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-kube-api-access-h8j2s\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.063447 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-logs\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.063517 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-ceph\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165207 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165271 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8j2s\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-kube-api-access-h8j2s\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165324 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-logs\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165355 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-ceph\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165387 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165468 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165502 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.165888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-logs\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.166068 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.168492 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-ceph\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.169304 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" 
Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.169858 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.170343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.184861 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8j2s\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-kube-api-access-h8j2s\") pod \"glance-default-external-api-0\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.213952 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.436614 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ea0350-921c-4016-861b-f61da343aaa6" path="/var/lib/kubelet/pods/d5ea0350-921c-4016-861b-f61da343aaa6/volumes" Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.604947 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:36:49 crc kubenswrapper[4931]: W0130 06:36:49.607737 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ab7585b_916e_4a6a_8aa8_da769aaa437e.slice/crio-266032ac228e0593b292c8f7becc87b38799dd433a3129a19387d1cf6ec27145 WatchSource:0}: Error finding container 266032ac228e0593b292c8f7becc87b38799dd433a3129a19387d1cf6ec27145: Status 404 returned error can't find the container with id 266032ac228e0593b292c8f7becc87b38799dd433a3129a19387d1cf6ec27145 Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.836964 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ab7585b-916e-4a6a-8aa8-da769aaa437e","Type":"ContainerStarted","Data":"266032ac228e0593b292c8f7becc87b38799dd433a3129a19387d1cf6ec27145"} Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.839900 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-log" containerID="cri-o://cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac" gracePeriod=30 Jan 30 06:36:49 crc kubenswrapper[4931]: I0130 06:36:49.840014 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-httpd" 
containerID="cri-o://467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a" gracePeriod=30 Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.378781 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.509095 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-httpd-run\") pod \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.509201 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-combined-ca-bundle\") pod \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.509332 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-scripts\") pod \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.509357 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-config-data\") pod \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.509381 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-logs\") pod \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\" (UID: 
\"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.509412 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-ceph\") pod \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.509465 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjfql\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-kube-api-access-mjfql\") pod \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\" (UID: \"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc\") " Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.515826 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-logs" (OuterVolumeSpecName: "logs") pod "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" (UID: "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.520940 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-kube-api-access-mjfql" (OuterVolumeSpecName: "kube-api-access-mjfql") pod "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" (UID: "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc"). InnerVolumeSpecName "kube-api-access-mjfql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.531263 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-scripts" (OuterVolumeSpecName: "scripts") pod "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" (UID: "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.535719 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" (UID: "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.537619 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-ceph" (OuterVolumeSpecName: "ceph") pod "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" (UID: "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.613023 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.613048 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.613056 4931 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-ceph\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.613064 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjfql\" (UniqueName: \"kubernetes.io/projected/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-kube-api-access-mjfql\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.613075 4931 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.631793 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" (UID: "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.648588 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-config-data" (OuterVolumeSpecName: "config-data") pod "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" (UID: "aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.715850 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.715886 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.848516 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ab7585b-916e-4a6a-8aa8-da769aaa437e","Type":"ContainerStarted","Data":"9f0644fcac396be5d0abd6c6b2b21170c61642dc6f5db73f638093f772151226"} Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.848558 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ab7585b-916e-4a6a-8aa8-da769aaa437e","Type":"ContainerStarted","Data":"d05f5e569f00e57864985b93b02a9cff6181003207de011c326a946bc0f2b2f3"} Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.850934 4931 generic.go:334] "Generic (PLEG): container finished" podID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerID="467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a" exitCode=0 Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.850971 4931 generic.go:334] "Generic (PLEG): container finished" podID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerID="cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac" exitCode=143 Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.850993 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc","Type":"ContainerDied","Data":"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a"} Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.851020 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc","Type":"ContainerDied","Data":"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac"} Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.851032 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc","Type":"ContainerDied","Data":"68279e3601f3bb1874e45c1fa4e255a1034408fe199037496d6fdf3b86235715"} Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.851046 4931 scope.go:117] "RemoveContainer" containerID="467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.851145 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.882640 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.882623085 podStartE2EDuration="2.882623085s" podCreationTimestamp="2026-01-30 06:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:50.878014906 +0000 UTC m=+5346.247925173" watchObservedRunningTime="2026-01-30 06:36:50.882623085 +0000 UTC m=+5346.252533352" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.883525 4931 scope.go:117] "RemoveContainer" containerID="cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.901801 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.912131 4931 scope.go:117] "RemoveContainer" containerID="467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a" Jan 30 06:36:50 crc kubenswrapper[4931]: E0130 06:36:50.912863 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a\": container with ID starting with 467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a not found: ID does not exist" containerID="467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.912902 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a"} err="failed to get container status \"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a\": rpc error: code = NotFound desc 
= could not find container \"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a\": container with ID starting with 467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a not found: ID does not exist" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.912928 4931 scope.go:117] "RemoveContainer" containerID="cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.913311 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:36:50 crc kubenswrapper[4931]: E0130 06:36:50.913386 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac\": container with ID starting with cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac not found: ID does not exist" containerID="cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.913501 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac"} err="failed to get container status \"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac\": rpc error: code = NotFound desc = could not find container \"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac\": container with ID starting with cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac not found: ID does not exist" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.913535 4931 scope.go:117] "RemoveContainer" containerID="467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.913809 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a"} err="failed to get container status \"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a\": rpc error: code = NotFound desc = could not find container \"467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a\": container with ID starting with 467241cd20503a3411a5230450fcdfda57f7e8c27d8d4b3f5f4ca872d524f71a not found: ID does not exist" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.913842 4931 scope.go:117] "RemoveContainer" containerID="cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.914099 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac"} err="failed to get container status \"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac\": rpc error: code = NotFound desc = could not find container \"cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac\": container with ID starting with cbb16f4eedd7535a133df59a844dc93e0135e0f924a3118f1d6c3d755c5826ac not found: ID does not exist" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.930856 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:36:50 crc kubenswrapper[4931]: E0130 06:36:50.931273 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-log" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.931285 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-log" Jan 30 06:36:50 crc kubenswrapper[4931]: E0130 06:36:50.931305 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-httpd" Jan 30 
06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.931311 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-httpd" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.931494 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-httpd" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.931503 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" containerName="glance-log" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.932452 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.936348 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 06:36:50 crc kubenswrapper[4931]: I0130 06:36:50.938449 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.033288 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7hqx\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-kube-api-access-x7hqx\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.033379 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-ceph\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.033403 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-logs\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.033459 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.033772 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.033875 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.034062 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.135868 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-ceph\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.136117 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-logs\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.136174 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.136213 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.136250 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.136307 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.136348 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7hqx\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-kube-api-access-x7hqx\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.136805 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.137101 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-logs\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.139839 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.139920 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.139991 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.140916 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.158012 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7hqx\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-kube-api-access-x7hqx\") pod \"glance-default-internal-api-0\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.247654 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.442116 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc" path="/var/lib/kubelet/pods/aae4a7a6-c1f4-4fd4-a844-f35eef27ffbc/volumes" Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.625486 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:36:51 crc kubenswrapper[4931]: I0130 06:36:51.879920 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50e397ef-0630-40db-a591-28d7584dee76","Type":"ContainerStarted","Data":"a16e5756f39cf431b53b36d453d1cc052129da02cb0cac04cc9d6bca4777d6a5"} Jan 30 06:36:52 crc kubenswrapper[4931]: I0130 06:36:52.890043 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50e397ef-0630-40db-a591-28d7584dee76","Type":"ContainerStarted","Data":"d2e2adf695545abae4fa1879a24ad260b75515f25d086b16166bdfe80e55cc22"} Jan 30 06:36:52 crc kubenswrapper[4931]: I0130 06:36:52.890575 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50e397ef-0630-40db-a591-28d7584dee76","Type":"ContainerStarted","Data":"32f7ab137db348695ddb60d19f60238d0ad9feb2d7bfb8d7925247bd1ac76d50"} Jan 30 06:36:52 crc kubenswrapper[4931]: I0130 06:36:52.920975 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.92095278 podStartE2EDuration="2.92095278s" podCreationTimestamp="2026-01-30 06:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:52.917813032 +0000 UTC m=+5348.287723329" watchObservedRunningTime="2026-01-30 06:36:52.92095278 +0000 UTC 
m=+5348.290863037" Jan 30 06:36:54 crc kubenswrapper[4931]: I0130 06:36:54.228121 4931 scope.go:117] "RemoveContainer" containerID="173f20cd392f59bb3d09e8d879e9a2c54ad0461fcc8850325a690b330805f7aa" Jan 30 06:36:54 crc kubenswrapper[4931]: I0130 06:36:54.317290 4931 scope.go:117] "RemoveContainer" containerID="c20d2d48ca6794144eddad5037d464e6a9ffdad2028bd7ef00590c377c6183ff" Jan 30 06:36:55 crc kubenswrapper[4931]: I0130 06:36:55.562812 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:36:55 crc kubenswrapper[4931]: I0130 06:36:55.627870 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d84887cc5-thvpx"] Jan 30 06:36:55 crc kubenswrapper[4931]: I0130 06:36:55.628183 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerName="dnsmasq-dns" containerID="cri-o://9a53ddc494ac40f2cd235561690db17e6fa36bf1d84c504bb1475d76a3081caf" gracePeriod=10 Jan 30 06:36:55 crc kubenswrapper[4931]: I0130 06:36:55.917358 4931 generic.go:334] "Generic (PLEG): container finished" podID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerID="9a53ddc494ac40f2cd235561690db17e6fa36bf1d84c504bb1475d76a3081caf" exitCode=0 Jan 30 06:36:55 crc kubenswrapper[4931]: I0130 06:36:55.917397 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" event={"ID":"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5","Type":"ContainerDied","Data":"9a53ddc494ac40f2cd235561690db17e6fa36bf1d84c504bb1475d76a3081caf"} Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.098657 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.228624 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-nb\") pod \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.229305 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-sb\") pod \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.229370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv7bs\" (UniqueName: \"kubernetes.io/projected/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-kube-api-access-qv7bs\") pod \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.229443 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-dns-svc\") pod \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.229506 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-config\") pod \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\" (UID: \"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5\") " Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.233850 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-kube-api-access-qv7bs" (OuterVolumeSpecName: "kube-api-access-qv7bs") pod "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" (UID: "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5"). InnerVolumeSpecName "kube-api-access-qv7bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.268065 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" (UID: "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.275100 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" (UID: "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.292665 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" (UID: "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.301213 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-config" (OuterVolumeSpecName: "config") pod "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" (UID: "11ff05b0-f35c-4f00-b0a7-a59f0368d4f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.331205 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.331251 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv7bs\" (UniqueName: \"kubernetes.io/projected/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-kube-api-access-qv7bs\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.331266 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.331281 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.331294 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.941348 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" event={"ID":"11ff05b0-f35c-4f00-b0a7-a59f0368d4f5","Type":"ContainerDied","Data":"48bcc562af333fbf9303da09ed90c070b458fec2e34b83616d91ef56cccabb57"} Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.941413 4931 scope.go:117] "RemoveContainer" containerID="9a53ddc494ac40f2cd235561690db17e6fa36bf1d84c504bb1475d76a3081caf" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.941525 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.973178 4931 scope.go:117] "RemoveContainer" containerID="753bd88c07189bffc8d69c9bf6a34e6d86af3f56c735e6c62d25cd5e5e4da562" Jan 30 06:36:56 crc kubenswrapper[4931]: I0130 06:36:56.991530 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d84887cc5-thvpx"] Jan 30 06:36:57 crc kubenswrapper[4931]: I0130 06:36:57.001648 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d84887cc5-thvpx"] Jan 30 06:36:57 crc kubenswrapper[4931]: I0130 06:36:57.431199 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" path="/var/lib/kubelet/pods/11ff05b0-f35c-4f00-b0a7-a59f0368d4f5/volumes" Jan 30 06:36:59 crc kubenswrapper[4931]: I0130 06:36:59.214684 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 06:36:59 crc kubenswrapper[4931]: I0130 06:36:59.214766 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 06:36:59 crc kubenswrapper[4931]: I0130 06:36:59.268243 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 06:36:59 crc kubenswrapper[4931]: I0130 06:36:59.307790 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 06:36:59 crc kubenswrapper[4931]: I0130 06:36:59.972554 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 06:36:59 crc kubenswrapper[4931]: I0130 06:36:59.972855 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 06:37:00 crc kubenswrapper[4931]: I0130 06:37:00.825649 4931 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d84887cc5-thvpx" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.39:5353: i/o timeout" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.248133 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.248251 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.285155 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.302202 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.842625 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.920013 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.993625 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:01 crc kubenswrapper[4931]: I0130 06:37:01.993682 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:03 crc kubenswrapper[4931]: I0130 06:37:03.881200 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:03 crc kubenswrapper[4931]: I0130 06:37:03.972537 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.275677 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zkc49"] Jan 30 06:37:10 crc kubenswrapper[4931]: E0130 06:37:10.276455 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerName="dnsmasq-dns" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.276467 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerName="dnsmasq-dns" Jan 30 06:37:10 crc kubenswrapper[4931]: E0130 06:37:10.276488 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerName="init" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.276495 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerName="init" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.276651 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ff05b0-f35c-4f00-b0a7-a59f0368d4f5" containerName="dnsmasq-dns" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.277180 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.286981 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zkc49"] Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.324752 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mgs\" (UniqueName: \"kubernetes.io/projected/80acfb99-2d96-453a-b29a-62f23608dd5f-kube-api-access-l8mgs\") pod \"placement-db-create-zkc49\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.324977 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80acfb99-2d96-453a-b29a-62f23608dd5f-operator-scripts\") pod \"placement-db-create-zkc49\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.393344 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9f04-account-create-update-wgg9g"] Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.395375 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.402070 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.405708 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f04-account-create-update-wgg9g"] Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.426763 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mgs\" (UniqueName: \"kubernetes.io/projected/80acfb99-2d96-453a-b29a-62f23608dd5f-kube-api-access-l8mgs\") pod \"placement-db-create-zkc49\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.427674 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80acfb99-2d96-453a-b29a-62f23608dd5f-operator-scripts\") pod \"placement-db-create-zkc49\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.430959 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80acfb99-2d96-453a-b29a-62f23608dd5f-operator-scripts\") pod \"placement-db-create-zkc49\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.445826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mgs\" (UniqueName: \"kubernetes.io/projected/80acfb99-2d96-453a-b29a-62f23608dd5f-kube-api-access-l8mgs\") pod \"placement-db-create-zkc49\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc 
kubenswrapper[4931]: I0130 06:37:10.529731 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d676c50-5909-4eeb-a22b-63823761ab17-operator-scripts\") pod \"placement-9f04-account-create-update-wgg9g\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.529780 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gscjc\" (UniqueName: \"kubernetes.io/projected/7d676c50-5909-4eeb-a22b-63823761ab17-kube-api-access-gscjc\") pod \"placement-9f04-account-create-update-wgg9g\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.611507 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zkc49" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.631734 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d676c50-5909-4eeb-a22b-63823761ab17-operator-scripts\") pod \"placement-9f04-account-create-update-wgg9g\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.631787 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gscjc\" (UniqueName: \"kubernetes.io/projected/7d676c50-5909-4eeb-a22b-63823761ab17-kube-api-access-gscjc\") pod \"placement-9f04-account-create-update-wgg9g\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.632528 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d676c50-5909-4eeb-a22b-63823761ab17-operator-scripts\") pod \"placement-9f04-account-create-update-wgg9g\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.654827 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gscjc\" (UniqueName: \"kubernetes.io/projected/7d676c50-5909-4eeb-a22b-63823761ab17-kube-api-access-gscjc\") pod \"placement-9f04-account-create-update-wgg9g\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:10 crc kubenswrapper[4931]: I0130 06:37:10.719946 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:11 crc kubenswrapper[4931]: I0130 06:37:11.063139 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zkc49"] Jan 30 06:37:11 crc kubenswrapper[4931]: W0130 06:37:11.066069 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80acfb99_2d96_453a_b29a_62f23608dd5f.slice/crio-4e88e0fb7db435e0262dd9c96e86538cd63b93d89b1584f0a7ac9701af498ac7 WatchSource:0}: Error finding container 4e88e0fb7db435e0262dd9c96e86538cd63b93d89b1584f0a7ac9701af498ac7: Status 404 returned error can't find the container with id 4e88e0fb7db435e0262dd9c96e86538cd63b93d89b1584f0a7ac9701af498ac7 Jan 30 06:37:11 crc kubenswrapper[4931]: I0130 06:37:11.091495 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zkc49" event={"ID":"80acfb99-2d96-453a-b29a-62f23608dd5f","Type":"ContainerStarted","Data":"4e88e0fb7db435e0262dd9c96e86538cd63b93d89b1584f0a7ac9701af498ac7"} Jan 30 06:37:11 crc kubenswrapper[4931]: I0130 
06:37:11.186635 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f04-account-create-update-wgg9g"] Jan 30 06:37:11 crc kubenswrapper[4931]: W0130 06:37:11.194312 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d676c50_5909_4eeb_a22b_63823761ab17.slice/crio-dfb836c72c923aa77e375aaf41768e76c04d5c434f7b1d66cdb2624a0eb722f9 WatchSource:0}: Error finding container dfb836c72c923aa77e375aaf41768e76c04d5c434f7b1d66cdb2624a0eb722f9: Status 404 returned error can't find the container with id dfb836c72c923aa77e375aaf41768e76c04d5c434f7b1d66cdb2624a0eb722f9 Jan 30 06:37:12 crc kubenswrapper[4931]: I0130 06:37:12.101832 4931 generic.go:334] "Generic (PLEG): container finished" podID="80acfb99-2d96-453a-b29a-62f23608dd5f" containerID="d17e4d5da3cbd98de6ce9452e4b21d30ccf3ad5f026d8b30b101024dc6fd4576" exitCode=0 Jan 30 06:37:12 crc kubenswrapper[4931]: I0130 06:37:12.101908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zkc49" event={"ID":"80acfb99-2d96-453a-b29a-62f23608dd5f","Type":"ContainerDied","Data":"d17e4d5da3cbd98de6ce9452e4b21d30ccf3ad5f026d8b30b101024dc6fd4576"} Jan 30 06:37:12 crc kubenswrapper[4931]: I0130 06:37:12.104362 4931 generic.go:334] "Generic (PLEG): container finished" podID="7d676c50-5909-4eeb-a22b-63823761ab17" containerID="312f2f7be76e7df2d1974a8fc3d9bcd846d76d9bf2e6bb52018a7e69743078de" exitCode=0 Jan 30 06:37:12 crc kubenswrapper[4931]: I0130 06:37:12.104446 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f04-account-create-update-wgg9g" event={"ID":"7d676c50-5909-4eeb-a22b-63823761ab17","Type":"ContainerDied","Data":"312f2f7be76e7df2d1974a8fc3d9bcd846d76d9bf2e6bb52018a7e69743078de"} Jan 30 06:37:12 crc kubenswrapper[4931]: I0130 06:37:12.104477 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f04-account-create-update-wgg9g" 
event={"ID":"7d676c50-5909-4eeb-a22b-63823761ab17","Type":"ContainerStarted","Data":"dfb836c72c923aa77e375aaf41768e76c04d5c434f7b1d66cdb2624a0eb722f9"} Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.596992 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.608088 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zkc49" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.686079 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d676c50-5909-4eeb-a22b-63823761ab17-operator-scripts\") pod \"7d676c50-5909-4eeb-a22b-63823761ab17\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.686204 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gscjc\" (UniqueName: \"kubernetes.io/projected/7d676c50-5909-4eeb-a22b-63823761ab17-kube-api-access-gscjc\") pod \"7d676c50-5909-4eeb-a22b-63823761ab17\" (UID: \"7d676c50-5909-4eeb-a22b-63823761ab17\") " Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.686261 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80acfb99-2d96-453a-b29a-62f23608dd5f-operator-scripts\") pod \"80acfb99-2d96-453a-b29a-62f23608dd5f\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.686414 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8mgs\" (UniqueName: \"kubernetes.io/projected/80acfb99-2d96-453a-b29a-62f23608dd5f-kube-api-access-l8mgs\") pod \"80acfb99-2d96-453a-b29a-62f23608dd5f\" (UID: \"80acfb99-2d96-453a-b29a-62f23608dd5f\") " Jan 
30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.686756 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d676c50-5909-4eeb-a22b-63823761ab17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d676c50-5909-4eeb-a22b-63823761ab17" (UID: "7d676c50-5909-4eeb-a22b-63823761ab17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.687193 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d676c50-5909-4eeb-a22b-63823761ab17-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.688073 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80acfb99-2d96-453a-b29a-62f23608dd5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80acfb99-2d96-453a-b29a-62f23608dd5f" (UID: "80acfb99-2d96-453a-b29a-62f23608dd5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.698853 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d676c50-5909-4eeb-a22b-63823761ab17-kube-api-access-gscjc" (OuterVolumeSpecName: "kube-api-access-gscjc") pod "7d676c50-5909-4eeb-a22b-63823761ab17" (UID: "7d676c50-5909-4eeb-a22b-63823761ab17"). InnerVolumeSpecName "kube-api-access-gscjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.699188 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80acfb99-2d96-453a-b29a-62f23608dd5f-kube-api-access-l8mgs" (OuterVolumeSpecName: "kube-api-access-l8mgs") pod "80acfb99-2d96-453a-b29a-62f23608dd5f" (UID: "80acfb99-2d96-453a-b29a-62f23608dd5f"). 
InnerVolumeSpecName "kube-api-access-l8mgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.789274 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8mgs\" (UniqueName: \"kubernetes.io/projected/80acfb99-2d96-453a-b29a-62f23608dd5f-kube-api-access-l8mgs\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.789317 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gscjc\" (UniqueName: \"kubernetes.io/projected/7d676c50-5909-4eeb-a22b-63823761ab17-kube-api-access-gscjc\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:13 crc kubenswrapper[4931]: I0130 06:37:13.789331 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80acfb99-2d96-453a-b29a-62f23608dd5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:14 crc kubenswrapper[4931]: I0130 06:37:14.127869 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zkc49" Jan 30 06:37:14 crc kubenswrapper[4931]: I0130 06:37:14.127857 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zkc49" event={"ID":"80acfb99-2d96-453a-b29a-62f23608dd5f","Type":"ContainerDied","Data":"4e88e0fb7db435e0262dd9c96e86538cd63b93d89b1584f0a7ac9701af498ac7"} Jan 30 06:37:14 crc kubenswrapper[4931]: I0130 06:37:14.128058 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e88e0fb7db435e0262dd9c96e86538cd63b93d89b1584f0a7ac9701af498ac7" Jan 30 06:37:14 crc kubenswrapper[4931]: I0130 06:37:14.130826 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f04-account-create-update-wgg9g" event={"ID":"7d676c50-5909-4eeb-a22b-63823761ab17","Type":"ContainerDied","Data":"dfb836c72c923aa77e375aaf41768e76c04d5c434f7b1d66cdb2624a0eb722f9"} Jan 30 06:37:14 crc kubenswrapper[4931]: I0130 06:37:14.130871 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9f04-account-create-update-wgg9g" Jan 30 06:37:14 crc kubenswrapper[4931]: I0130 06:37:14.130885 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfb836c72c923aa77e375aaf41768e76c04d5c434f7b1d66cdb2624a0eb722f9" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.724079 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-x9ngk"] Jan 30 06:37:15 crc kubenswrapper[4931]: E0130 06:37:15.724632 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80acfb99-2d96-453a-b29a-62f23608dd5f" containerName="mariadb-database-create" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.724645 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="80acfb99-2d96-453a-b29a-62f23608dd5f" containerName="mariadb-database-create" Jan 30 06:37:15 crc kubenswrapper[4931]: E0130 06:37:15.724657 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d676c50-5909-4eeb-a22b-63823761ab17" containerName="mariadb-account-create-update" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.724662 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d676c50-5909-4eeb-a22b-63823761ab17" containerName="mariadb-account-create-update" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.724824 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d676c50-5909-4eeb-a22b-63823761ab17" containerName="mariadb-account-create-update" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.724840 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="80acfb99-2d96-453a-b29a-62f23608dd5f" containerName="mariadb-database-create" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.725354 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.728361 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.728543 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.728856 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rbl8r" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.739133 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x9ngk"] Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.746129 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf9d65499-j99dc"] Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.748691 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.774064 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf9d65499-j99dc"] Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824590 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-scripts\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824642 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9s8d\" (UniqueName: \"kubernetes.io/projected/08e7d2a9-093c-4495-81ab-99972c72b179-kube-api-access-l9s8d\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824677 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-sb\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824698 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-config-data\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824717 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/08e7d2a9-093c-4495-81ab-99972c72b179-logs\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824757 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrwmf\" (UniqueName: \"kubernetes.io/projected/94c3e877-4729-4777-9460-2fdce31b2bc3-kube-api-access-qrwmf\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824853 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-dns-svc\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824951 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-combined-ca-bundle\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.824986 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-nb\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.825069 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-config\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.926974 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrwmf\" (UniqueName: \"kubernetes.io/projected/94c3e877-4729-4777-9460-2fdce31b2bc3-kube-api-access-qrwmf\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.927043 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-dns-svc\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.927096 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-combined-ca-bundle\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.927121 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-nb\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.927163 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-config\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.927279 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-scripts\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928149 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-dns-svc\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928177 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-config\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928306 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-nb\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928451 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9s8d\" (UniqueName: \"kubernetes.io/projected/08e7d2a9-093c-4495-81ab-99972c72b179-kube-api-access-l9s8d\") pod \"placement-db-sync-x9ngk\" (UID: 
\"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928507 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-sb\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928538 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-config-data\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928559 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08e7d2a9-093c-4495-81ab-99972c72b179-logs\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.928964 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08e7d2a9-093c-4495-81ab-99972c72b179-logs\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.929261 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-sb\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.932764 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-combined-ca-bundle\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.938222 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-config-data\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.941202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-scripts\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.947897 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9s8d\" (UniqueName: \"kubernetes.io/projected/08e7d2a9-093c-4495-81ab-99972c72b179-kube-api-access-l9s8d\") pod \"placement-db-sync-x9ngk\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:15 crc kubenswrapper[4931]: I0130 06:37:15.951758 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrwmf\" (UniqueName: \"kubernetes.io/projected/94c3e877-4729-4777-9460-2fdce31b2bc3-kube-api-access-qrwmf\") pod \"dnsmasq-dns-bf9d65499-j99dc\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:16 crc kubenswrapper[4931]: I0130 06:37:16.047787 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:16 crc kubenswrapper[4931]: I0130 06:37:16.071845 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:16 crc kubenswrapper[4931]: I0130 06:37:16.550855 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x9ngk"] Jan 30 06:37:16 crc kubenswrapper[4931]: I0130 06:37:16.668100 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf9d65499-j99dc"] Jan 30 06:37:17 crc kubenswrapper[4931]: I0130 06:37:17.188245 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x9ngk" event={"ID":"08e7d2a9-093c-4495-81ab-99972c72b179","Type":"ContainerStarted","Data":"a9a1121e8223cb02db35e17fa42991de13dcf37e427dbdd0bf16fa8038651093"} Jan 30 06:37:17 crc kubenswrapper[4931]: I0130 06:37:17.188289 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x9ngk" event={"ID":"08e7d2a9-093c-4495-81ab-99972c72b179","Type":"ContainerStarted","Data":"57de75e2f05fade5baa1cc643f4af3f769856d06f6c34e1b83f87baa4389aede"} Jan 30 06:37:17 crc kubenswrapper[4931]: I0130 06:37:17.193950 4931 generic.go:334] "Generic (PLEG): container finished" podID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerID="d3534b4266cad4f8c042a5d1a723852bc5684f3d2cae49aae0ea01f2e1276ee4" exitCode=0 Jan 30 06:37:17 crc kubenswrapper[4931]: I0130 06:37:17.194227 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" event={"ID":"94c3e877-4729-4777-9460-2fdce31b2bc3","Type":"ContainerDied","Data":"d3534b4266cad4f8c042a5d1a723852bc5684f3d2cae49aae0ea01f2e1276ee4"} Jan 30 06:37:17 crc kubenswrapper[4931]: I0130 06:37:17.194250 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" 
event={"ID":"94c3e877-4729-4777-9460-2fdce31b2bc3","Type":"ContainerStarted","Data":"cbfdef9dfaed32413bc6b3a12689ef221574794a4832dba0bff0f3e04d140623"} Jan 30 06:37:17 crc kubenswrapper[4931]: I0130 06:37:17.215254 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-x9ngk" podStartSLOduration=2.215237447 podStartE2EDuration="2.215237447s" podCreationTimestamp="2026-01-30 06:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:37:17.208176199 +0000 UTC m=+5372.578086466" watchObservedRunningTime="2026-01-30 06:37:17.215237447 +0000 UTC m=+5372.585147704" Jan 30 06:37:18 crc kubenswrapper[4931]: I0130 06:37:18.206180 4931 generic.go:334] "Generic (PLEG): container finished" podID="08e7d2a9-093c-4495-81ab-99972c72b179" containerID="a9a1121e8223cb02db35e17fa42991de13dcf37e427dbdd0bf16fa8038651093" exitCode=0 Jan 30 06:37:18 crc kubenswrapper[4931]: I0130 06:37:18.206280 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x9ngk" event={"ID":"08e7d2a9-093c-4495-81ab-99972c72b179","Type":"ContainerDied","Data":"a9a1121e8223cb02db35e17fa42991de13dcf37e427dbdd0bf16fa8038651093"} Jan 30 06:37:18 crc kubenswrapper[4931]: I0130 06:37:18.212088 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" event={"ID":"94c3e877-4729-4777-9460-2fdce31b2bc3","Type":"ContainerStarted","Data":"2e8dbb3daa04d0b7ca86823625c46d0d78b12efba028999fcdcceb4252172d0f"} Jan 30 06:37:18 crc kubenswrapper[4931]: I0130 06:37:18.212235 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:18 crc kubenswrapper[4931]: I0130 06:37:18.251680 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" podStartSLOduration=3.251663842 
podStartE2EDuration="3.251663842s" podCreationTimestamp="2026-01-30 06:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:37:18.245827999 +0000 UTC m=+5373.615738266" watchObservedRunningTime="2026-01-30 06:37:18.251663842 +0000 UTC m=+5373.621574099" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.616937 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.710167 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-scripts\") pod \"08e7d2a9-093c-4495-81ab-99972c72b179\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.710238 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9s8d\" (UniqueName: \"kubernetes.io/projected/08e7d2a9-093c-4495-81ab-99972c72b179-kube-api-access-l9s8d\") pod \"08e7d2a9-093c-4495-81ab-99972c72b179\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.710261 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08e7d2a9-093c-4495-81ab-99972c72b179-logs\") pod \"08e7d2a9-093c-4495-81ab-99972c72b179\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.710371 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-combined-ca-bundle\") pod \"08e7d2a9-093c-4495-81ab-99972c72b179\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 
06:37:19.710389 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-config-data\") pod \"08e7d2a9-093c-4495-81ab-99972c72b179\" (UID: \"08e7d2a9-093c-4495-81ab-99972c72b179\") " Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.711121 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08e7d2a9-093c-4495-81ab-99972c72b179-logs" (OuterVolumeSpecName: "logs") pod "08e7d2a9-093c-4495-81ab-99972c72b179" (UID: "08e7d2a9-093c-4495-81ab-99972c72b179"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.717666 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e7d2a9-093c-4495-81ab-99972c72b179-kube-api-access-l9s8d" (OuterVolumeSpecName: "kube-api-access-l9s8d") pod "08e7d2a9-093c-4495-81ab-99972c72b179" (UID: "08e7d2a9-093c-4495-81ab-99972c72b179"). InnerVolumeSpecName "kube-api-access-l9s8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.728238 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-scripts" (OuterVolumeSpecName: "scripts") pod "08e7d2a9-093c-4495-81ab-99972c72b179" (UID: "08e7d2a9-093c-4495-81ab-99972c72b179"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.734293 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08e7d2a9-093c-4495-81ab-99972c72b179" (UID: "08e7d2a9-093c-4495-81ab-99972c72b179"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.736635 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-config-data" (OuterVolumeSpecName: "config-data") pod "08e7d2a9-093c-4495-81ab-99972c72b179" (UID: "08e7d2a9-093c-4495-81ab-99972c72b179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.812201 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.812237 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9s8d\" (UniqueName: \"kubernetes.io/projected/08e7d2a9-093c-4495-81ab-99972c72b179-kube-api-access-l9s8d\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.812268 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08e7d2a9-093c-4495-81ab-99972c72b179-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.812282 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:19 crc kubenswrapper[4931]: I0130 06:37:19.812295 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e7d2a9-093c-4495-81ab-99972c72b179-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.237234 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x9ngk" 
event={"ID":"08e7d2a9-093c-4495-81ab-99972c72b179","Type":"ContainerDied","Data":"57de75e2f05fade5baa1cc643f4af3f769856d06f6c34e1b83f87baa4389aede"} Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.237289 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57de75e2f05fade5baa1cc643f4af3f769856d06f6c34e1b83f87baa4389aede" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.237370 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x9ngk" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.324003 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b467bdbbb-ds8j4"] Jan 30 06:37:20 crc kubenswrapper[4931]: E0130 06:37:20.324887 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e7d2a9-093c-4495-81ab-99972c72b179" containerName="placement-db-sync" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.325162 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e7d2a9-093c-4495-81ab-99972c72b179" containerName="placement-db-sync" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.325847 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e7d2a9-093c-4495-81ab-99972c72b179" containerName="placement-db-sync" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.327551 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.330161 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.330474 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rbl8r" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.330823 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.332632 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b467bdbbb-ds8j4"] Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.420821 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-combined-ca-bundle\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.421238 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-logs\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.421262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dstw\" (UniqueName: \"kubernetes.io/projected/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-kube-api-access-5dstw\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 
06:37:20.421286 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-config-data\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.421312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-scripts\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.527484 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-logs\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.527560 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dstw\" (UniqueName: \"kubernetes.io/projected/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-kube-api-access-5dstw\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.527592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-config-data\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.527636 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-scripts\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.527709 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-combined-ca-bundle\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.529005 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-logs\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.535722 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-config-data\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.542588 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-combined-ca-bundle\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.547053 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-scripts\") pod 
\"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.557456 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dstw\" (UniqueName: \"kubernetes.io/projected/6359f2c1-ac0c-4084-969e-7cff11e8b4d8-kube-api-access-5dstw\") pod \"placement-6b467bdbbb-ds8j4\" (UID: \"6359f2c1-ac0c-4084-969e-7cff11e8b4d8\") " pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:20 crc kubenswrapper[4931]: I0130 06:37:20.652072 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:21 crc kubenswrapper[4931]: I0130 06:37:21.087001 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b467bdbbb-ds8j4"] Jan 30 06:37:21 crc kubenswrapper[4931]: I0130 06:37:21.249907 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b467bdbbb-ds8j4" event={"ID":"6359f2c1-ac0c-4084-969e-7cff11e8b4d8","Type":"ContainerStarted","Data":"74e2dbece03674369045d767d71b925745e45fd3043851d166f34f7f4e62abbe"} Jan 30 06:37:22 crc kubenswrapper[4931]: I0130 06:37:22.263917 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b467bdbbb-ds8j4" event={"ID":"6359f2c1-ac0c-4084-969e-7cff11e8b4d8","Type":"ContainerStarted","Data":"46dc669fdce1036d06ad44630a282a5294d2d76e0a512ad39f056a4c1e78453f"} Jan 30 06:37:22 crc kubenswrapper[4931]: I0130 06:37:22.264598 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b467bdbbb-ds8j4" event={"ID":"6359f2c1-ac0c-4084-969e-7cff11e8b4d8","Type":"ContainerStarted","Data":"1d8c77779be70b77d3eeed3d42eed85f62d4217b5baa219074ac9acb30a622a3"} Jan 30 06:37:22 crc kubenswrapper[4931]: I0130 06:37:22.264643 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:22 crc 
kubenswrapper[4931]: I0130 06:37:22.264667 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:22 crc kubenswrapper[4931]: I0130 06:37:22.293249 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b467bdbbb-ds8j4" podStartSLOduration=2.293213719 podStartE2EDuration="2.293213719s" podCreationTimestamp="2026-01-30 06:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:37:22.289087823 +0000 UTC m=+5377.658998090" watchObservedRunningTime="2026-01-30 06:37:22.293213719 +0000 UTC m=+5377.663124016" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.073747 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.163630 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf7fddbc7-982b5"] Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.163920 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" podUID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerName="dnsmasq-dns" containerID="cri-o://64b8f4307220d8bf205a24561416d6dc5c9e47a88b8a78a79bc73aae4b6035c4" gracePeriod=10 Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.297663 4931 generic.go:334] "Generic (PLEG): container finished" podID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerID="64b8f4307220d8bf205a24561416d6dc5c9e47a88b8a78a79bc73aae4b6035c4" exitCode=0 Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.297710 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" event={"ID":"e7f7543a-72f1-4937-95d9-8869b77ab81d","Type":"ContainerDied","Data":"64b8f4307220d8bf205a24561416d6dc5c9e47a88b8a78a79bc73aae4b6035c4"} Jan 
30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.664312 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.759143 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-nb\") pod \"e7f7543a-72f1-4937-95d9-8869b77ab81d\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.759229 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-config\") pod \"e7f7543a-72f1-4937-95d9-8869b77ab81d\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.759252 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-dns-svc\") pod \"e7f7543a-72f1-4937-95d9-8869b77ab81d\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.759325 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-sb\") pod \"e7f7543a-72f1-4937-95d9-8869b77ab81d\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.759502 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6k6k\" (UniqueName: \"kubernetes.io/projected/e7f7543a-72f1-4937-95d9-8869b77ab81d-kube-api-access-s6k6k\") pod \"e7f7543a-72f1-4937-95d9-8869b77ab81d\" (UID: \"e7f7543a-72f1-4937-95d9-8869b77ab81d\") " Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 
06:37:26.771122 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f7543a-72f1-4937-95d9-8869b77ab81d-kube-api-access-s6k6k" (OuterVolumeSpecName: "kube-api-access-s6k6k") pod "e7f7543a-72f1-4937-95d9-8869b77ab81d" (UID: "e7f7543a-72f1-4937-95d9-8869b77ab81d"). InnerVolumeSpecName "kube-api-access-s6k6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.811561 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-config" (OuterVolumeSpecName: "config") pod "e7f7543a-72f1-4937-95d9-8869b77ab81d" (UID: "e7f7543a-72f1-4937-95d9-8869b77ab81d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.811745 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7f7543a-72f1-4937-95d9-8869b77ab81d" (UID: "e7f7543a-72f1-4937-95d9-8869b77ab81d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.812929 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7f7543a-72f1-4937-95d9-8869b77ab81d" (UID: "e7f7543a-72f1-4937-95d9-8869b77ab81d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.817313 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7f7543a-72f1-4937-95d9-8869b77ab81d" (UID: "e7f7543a-72f1-4937-95d9-8869b77ab81d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.860754 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.860784 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.860794 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.860802 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7f7543a-72f1-4937-95d9-8869b77ab81d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:26 crc kubenswrapper[4931]: I0130 06:37:26.860812 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6k6k\" (UniqueName: \"kubernetes.io/projected/e7f7543a-72f1-4937-95d9-8869b77ab81d-kube-api-access-s6k6k\") on node \"crc\" DevicePath \"\"" Jan 30 06:37:27 crc kubenswrapper[4931]: I0130 06:37:27.311964 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" 
event={"ID":"e7f7543a-72f1-4937-95d9-8869b77ab81d","Type":"ContainerDied","Data":"424c4112ef530e297d0cff0d4af771a05512506e55897ed954c10a6043fe7171"} Jan 30 06:37:27 crc kubenswrapper[4931]: I0130 06:37:27.312043 4931 scope.go:117] "RemoveContainer" containerID="64b8f4307220d8bf205a24561416d6dc5c9e47a88b8a78a79bc73aae4b6035c4" Jan 30 06:37:27 crc kubenswrapper[4931]: I0130 06:37:27.312078 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf7fddbc7-982b5" Jan 30 06:37:27 crc kubenswrapper[4931]: I0130 06:37:27.334681 4931 scope.go:117] "RemoveContainer" containerID="7119d5284982674648c0826e4626a711e51ca2133b6b1310bb2a6ca06e64c6b3" Jan 30 06:37:27 crc kubenswrapper[4931]: I0130 06:37:27.376780 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf7fddbc7-982b5"] Jan 30 06:37:27 crc kubenswrapper[4931]: I0130 06:37:27.389278 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cf7fddbc7-982b5"] Jan 30 06:37:27 crc kubenswrapper[4931]: I0130 06:37:27.437953 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f7543a-72f1-4937-95d9-8869b77ab81d" path="/var/lib/kubelet/pods/e7f7543a-72f1-4937-95d9-8869b77ab81d/volumes" Jan 30 06:37:51 crc kubenswrapper[4931]: I0130 06:37:51.622679 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:51 crc kubenswrapper[4931]: I0130 06:37:51.623358 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b467bdbbb-ds8j4" Jan 30 06:37:57 crc kubenswrapper[4931]: I0130 06:37:57.363552 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 
06:37:57 crc kubenswrapper[4931]: I0130 06:37:57.363933 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.533221 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vm2gb"] Jan 30 06:38:13 crc kubenswrapper[4931]: E0130 06:38:13.534454 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerName="dnsmasq-dns" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.534481 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerName="dnsmasq-dns" Jan 30 06:38:13 crc kubenswrapper[4931]: E0130 06:38:13.534508 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerName="init" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.534519 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerName="init" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.534782 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f7543a-72f1-4937-95d9-8869b77ab81d" containerName="dnsmasq-dns" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.535555 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.553547 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vm2gb"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.615219 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5xpsl"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.617908 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.623001 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5xpsl"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.704377 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-operator-scripts\") pod \"nova-api-db-create-vm2gb\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.704499 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk4vh\" (UniqueName: \"kubernetes.io/projected/ede2117e-e3d5-46f6-8a54-1cd987370470-kube-api-access-hk4vh\") pod \"nova-cell0-db-create-5xpsl\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.704642 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ede2117e-e3d5-46f6-8a54-1cd987370470-operator-scripts\") pod \"nova-cell0-db-create-5xpsl\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:13 crc kubenswrapper[4931]: 
I0130 06:38:13.704721 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqz62\" (UniqueName: \"kubernetes.io/projected/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-kube-api-access-rqz62\") pod \"nova-api-db-create-vm2gb\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.731271 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c3ab-account-create-update-6wqgk"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.732346 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.736400 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.736852 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c3ab-account-create-update-6wqgk"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.746328 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7dkqh"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.747318 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.754279 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7dkqh"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.806569 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ede2117e-e3d5-46f6-8a54-1cd987370470-operator-scripts\") pod \"nova-cell0-db-create-5xpsl\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.806641 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqz62\" (UniqueName: \"kubernetes.io/projected/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-kube-api-access-rqz62\") pod \"nova-api-db-create-vm2gb\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.806682 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-operator-scripts\") pod \"nova-api-db-create-vm2gb\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.806713 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk4vh\" (UniqueName: \"kubernetes.io/projected/ede2117e-e3d5-46f6-8a54-1cd987370470-kube-api-access-hk4vh\") pod \"nova-cell0-db-create-5xpsl\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.807742 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ede2117e-e3d5-46f6-8a54-1cd987370470-operator-scripts\") pod \"nova-cell0-db-create-5xpsl\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.808344 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-operator-scripts\") pod \"nova-api-db-create-vm2gb\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.822934 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk4vh\" (UniqueName: \"kubernetes.io/projected/ede2117e-e3d5-46f6-8a54-1cd987370470-kube-api-access-hk4vh\") pod \"nova-cell0-db-create-5xpsl\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.825195 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqz62\" (UniqueName: \"kubernetes.io/projected/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-kube-api-access-rqz62\") pod \"nova-api-db-create-vm2gb\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.866038 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.915126 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9932-account-create-update-6qlx2"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.917646 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/730d8243-e8f1-4b7a-b012-d65ff132d427-operator-scripts\") pod \"nova-api-c3ab-account-create-update-6wqgk\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.917953 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jrp\" (UniqueName: \"kubernetes.io/projected/0d2ac10a-2179-4d51-b7e8-31ac3621d798-kube-api-access-k9jrp\") pod \"nova-cell1-db-create-7dkqh\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.918012 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2ac10a-2179-4d51-b7e8-31ac3621d798-operator-scripts\") pod \"nova-cell1-db-create-7dkqh\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.918058 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6zhz\" (UniqueName: \"kubernetes.io/projected/730d8243-e8f1-4b7a-b012-d65ff132d427-kube-api-access-l6zhz\") pod \"nova-api-c3ab-account-create-update-6wqgk\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:13 crc kubenswrapper[4931]: 
I0130 06:38:13.919491 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.928241 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.929938 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9932-account-create-update-6qlx2"] Jan 30 06:38:13 crc kubenswrapper[4931]: I0130 06:38:13.934021 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.022950 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/730d8243-e8f1-4b7a-b012-d65ff132d427-operator-scripts\") pod \"nova-api-c3ab-account-create-update-6wqgk\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.023258 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jrp\" (UniqueName: \"kubernetes.io/projected/0d2ac10a-2179-4d51-b7e8-31ac3621d798-kube-api-access-k9jrp\") pod \"nova-cell1-db-create-7dkqh\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.023286 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2ac10a-2179-4d51-b7e8-31ac3621d798-operator-scripts\") pod \"nova-cell1-db-create-7dkqh\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.024133 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/730d8243-e8f1-4b7a-b012-d65ff132d427-operator-scripts\") pod \"nova-api-c3ab-account-create-update-6wqgk\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.024133 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2ac10a-2179-4d51-b7e8-31ac3621d798-operator-scripts\") pod \"nova-cell1-db-create-7dkqh\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.024134 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8gzc\" (UniqueName: \"kubernetes.io/projected/0c2a0233-04c5-4382-948d-809c1216b075-kube-api-access-t8gzc\") pod \"nova-cell0-9932-account-create-update-6qlx2\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.024274 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6zhz\" (UniqueName: \"kubernetes.io/projected/730d8243-e8f1-4b7a-b012-d65ff132d427-kube-api-access-l6zhz\") pod \"nova-api-c3ab-account-create-update-6wqgk\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.025455 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c2a0233-04c5-4382-948d-809c1216b075-operator-scripts\") pod \"nova-cell0-9932-account-create-update-6qlx2\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " 
pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.044684 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6zhz\" (UniqueName: \"kubernetes.io/projected/730d8243-e8f1-4b7a-b012-d65ff132d427-kube-api-access-l6zhz\") pod \"nova-api-c3ab-account-create-update-6wqgk\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.049196 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jrp\" (UniqueName: \"kubernetes.io/projected/0d2ac10a-2179-4d51-b7e8-31ac3621d798-kube-api-access-k9jrp\") pod \"nova-cell1-db-create-7dkqh\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.065746 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.078191 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.126716 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8gzc\" (UniqueName: \"kubernetes.io/projected/0c2a0233-04c5-4382-948d-809c1216b075-kube-api-access-t8gzc\") pod \"nova-cell0-9932-account-create-update-6qlx2\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.126812 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c2a0233-04c5-4382-948d-809c1216b075-operator-scripts\") pod \"nova-cell0-9932-account-create-update-6qlx2\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.127559 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c2a0233-04c5-4382-948d-809c1216b075-operator-scripts\") pod \"nova-cell0-9932-account-create-update-6qlx2\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.134349 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-184a-account-create-update-b6t5s"] Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.135506 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.137285 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.148264 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8gzc\" (UniqueName: \"kubernetes.io/projected/0c2a0233-04c5-4382-948d-809c1216b075-kube-api-access-t8gzc\") pod \"nova-cell0-9932-account-create-update-6qlx2\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.155538 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-184a-account-create-update-b6t5s"] Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.324661 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.329158 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a06db3-c381-45ef-883d-ee7393822e5a-operator-scripts\") pod \"nova-cell1-184a-account-create-update-b6t5s\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.329195 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snn26\" (UniqueName: \"kubernetes.io/projected/c7a06db3-c381-45ef-883d-ee7393822e5a-kube-api-access-snn26\") pod \"nova-cell1-184a-account-create-update-b6t5s\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 
06:38:14.391718 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vm2gb"] Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.431087 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a06db3-c381-45ef-883d-ee7393822e5a-operator-scripts\") pod \"nova-cell1-184a-account-create-update-b6t5s\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.431130 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snn26\" (UniqueName: \"kubernetes.io/projected/c7a06db3-c381-45ef-883d-ee7393822e5a-kube-api-access-snn26\") pod \"nova-cell1-184a-account-create-update-b6t5s\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.432230 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a06db3-c381-45ef-883d-ee7393822e5a-operator-scripts\") pod \"nova-cell1-184a-account-create-update-b6t5s\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.459306 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snn26\" (UniqueName: \"kubernetes.io/projected/c7a06db3-c381-45ef-883d-ee7393822e5a-kube-api-access-snn26\") pod \"nova-cell1-184a-account-create-update-b6t5s\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.506693 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5xpsl"] Jan 30 06:38:14 crc 
kubenswrapper[4931]: W0130 06:38:14.516910 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podede2117e_e3d5_46f6_8a54_1cd987370470.slice/crio-5b80f05329cca2897dfa70c9bdc0e4c4f460e556ac74de0d1d1d646a96c57a30 WatchSource:0}: Error finding container 5b80f05329cca2897dfa70c9bdc0e4c4f460e556ac74de0d1d1d646a96c57a30: Status 404 returned error can't find the container with id 5b80f05329cca2897dfa70c9bdc0e4c4f460e556ac74de0d1d1d646a96c57a30 Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.631315 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7dkqh"] Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.639149 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c3ab-account-create-update-6wqgk"] Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.757745 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.793539 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9932-account-create-update-6qlx2"] Jan 30 06:38:14 crc kubenswrapper[4931]: W0130 06:38:14.800238 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c2a0233_04c5_4382_948d_809c1216b075.slice/crio-af5a106da150f0bb17662ab7ab3e10a8f540d31cc16fca47df9962bdb4fded43 WatchSource:0}: Error finding container af5a106da150f0bb17662ab7ab3e10a8f540d31cc16fca47df9962bdb4fded43: Status 404 returned error can't find the container with id af5a106da150f0bb17662ab7ab3e10a8f540d31cc16fca47df9962bdb4fded43 Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.829881 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5xpsl" 
event={"ID":"ede2117e-e3d5-46f6-8a54-1cd987370470","Type":"ContainerStarted","Data":"3e425706ce28b09c3d63a84e9658921cb423491b485b43961ce5f36589fbc46a"} Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.829929 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5xpsl" event={"ID":"ede2117e-e3d5-46f6-8a54-1cd987370470","Type":"ContainerStarted","Data":"5b80f05329cca2897dfa70c9bdc0e4c4f460e556ac74de0d1d1d646a96c57a30"} Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.831768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c3ab-account-create-update-6wqgk" event={"ID":"730d8243-e8f1-4b7a-b012-d65ff132d427","Type":"ContainerStarted","Data":"8b48030bead5b8810494ecca940f3ce0fe837fa405d9830629e0021c23f80e05"} Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.848728 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vm2gb" event={"ID":"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4","Type":"ContainerStarted","Data":"e6ba02a25b1e0ee2f6ab68e84a5695ffab01e5de9788f62975d6f6c203f6437a"} Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.848811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vm2gb" event={"ID":"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4","Type":"ContainerStarted","Data":"a98150e0e31af136066d55cd6aac76cfe239ea6044527ea73c361e8d1c5d2a0e"} Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.851880 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7dkqh" event={"ID":"0d2ac10a-2179-4d51-b7e8-31ac3621d798","Type":"ContainerStarted","Data":"27e13b6b085d0725e59ac8bc7078474da43dc2cdd83ba76b6820d4ffc9d8594f"} Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.851935 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7dkqh" 
event={"ID":"0d2ac10a-2179-4d51-b7e8-31ac3621d798","Type":"ContainerStarted","Data":"e7421e16eac378e7a0dea1f9d24b088f5207c980e51a7f3d6f384ea981d58f88"} Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.853347 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-5xpsl" podStartSLOduration=1.853333329 podStartE2EDuration="1.853333329s" podCreationTimestamp="2026-01-30 06:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:14.848244046 +0000 UTC m=+5430.218154313" watchObservedRunningTime="2026-01-30 06:38:14.853333329 +0000 UTC m=+5430.223243586" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.868005 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-vm2gb" podStartSLOduration=1.867986749 podStartE2EDuration="1.867986749s" podCreationTimestamp="2026-01-30 06:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:14.865056147 +0000 UTC m=+5430.234966404" watchObservedRunningTime="2026-01-30 06:38:14.867986749 +0000 UTC m=+5430.237897006" Jan 30 06:38:14 crc kubenswrapper[4931]: I0130 06:38:14.897159 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-7dkqh" podStartSLOduration=1.897120284 podStartE2EDuration="1.897120284s" podCreationTimestamp="2026-01-30 06:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:14.882960878 +0000 UTC m=+5430.252871135" watchObservedRunningTime="2026-01-30 06:38:14.897120284 +0000 UTC m=+5430.267030551" Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.289900 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-184a-account-create-update-b6t5s"] Jan 30 06:38:15 crc kubenswrapper[4931]: W0130 06:38:15.299833 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7a06db3_c381_45ef_883d_ee7393822e5a.slice/crio-be0cce4914a349c03bb254b8aec6996628db2bb168ecf5eee1fd286e7266cb60 WatchSource:0}: Error finding container be0cce4914a349c03bb254b8aec6996628db2bb168ecf5eee1fd286e7266cb60: Status 404 returned error can't find the container with id be0cce4914a349c03bb254b8aec6996628db2bb168ecf5eee1fd286e7266cb60 Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.865506 4931 generic.go:334] "Generic (PLEG): container finished" podID="5d595bdd-ffa6-4292-b4c2-1eba0736a6a4" containerID="e6ba02a25b1e0ee2f6ab68e84a5695ffab01e5de9788f62975d6f6c203f6437a" exitCode=0 Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.865695 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vm2gb" event={"ID":"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4","Type":"ContainerDied","Data":"e6ba02a25b1e0ee2f6ab68e84a5695ffab01e5de9788f62975d6f6c203f6437a"} Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.869535 4931 generic.go:334] "Generic (PLEG): container finished" podID="0d2ac10a-2179-4d51-b7e8-31ac3621d798" containerID="27e13b6b085d0725e59ac8bc7078474da43dc2cdd83ba76b6820d4ffc9d8594f" exitCode=0 Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.869747 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7dkqh" event={"ID":"0d2ac10a-2179-4d51-b7e8-31ac3621d798","Type":"ContainerDied","Data":"27e13b6b085d0725e59ac8bc7078474da43dc2cdd83ba76b6820d4ffc9d8594f"} Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.876189 4931 generic.go:334] "Generic (PLEG): container finished" podID="c7a06db3-c381-45ef-883d-ee7393822e5a" containerID="5ecad2bc0e017bf1777e069b0d51cce5fdf83ffb212486a161ec83e5ab28a776" exitCode=0 Jan 30 
06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.876268 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-184a-account-create-update-b6t5s" event={"ID":"c7a06db3-c381-45ef-883d-ee7393822e5a","Type":"ContainerDied","Data":"5ecad2bc0e017bf1777e069b0d51cce5fdf83ffb212486a161ec83e5ab28a776"} Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.876298 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-184a-account-create-update-b6t5s" event={"ID":"c7a06db3-c381-45ef-883d-ee7393822e5a","Type":"ContainerStarted","Data":"be0cce4914a349c03bb254b8aec6996628db2bb168ecf5eee1fd286e7266cb60"} Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.885585 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5xpsl" event={"ID":"ede2117e-e3d5-46f6-8a54-1cd987370470","Type":"ContainerDied","Data":"3e425706ce28b09c3d63a84e9658921cb423491b485b43961ce5f36589fbc46a"} Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.887747 4931 generic.go:334] "Generic (PLEG): container finished" podID="ede2117e-e3d5-46f6-8a54-1cd987370470" containerID="3e425706ce28b09c3d63a84e9658921cb423491b485b43961ce5f36589fbc46a" exitCode=0 Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.891475 4931 generic.go:334] "Generic (PLEG): container finished" podID="730d8243-e8f1-4b7a-b012-d65ff132d427" containerID="6d585a871e0bcc02099ca7b0fec64c91cc44765f6bff9850b839ed74e64354fb" exitCode=0 Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.891552 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c3ab-account-create-update-6wqgk" event={"ID":"730d8243-e8f1-4b7a-b012-d65ff132d427","Type":"ContainerDied","Data":"6d585a871e0bcc02099ca7b0fec64c91cc44765f6bff9850b839ed74e64354fb"} Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.893935 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c2a0233-04c5-4382-948d-809c1216b075" 
containerID="982580309a618618acf59d7ed62dffc9baa63654e107e79e17f31ae5e09b9d10" exitCode=0 Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.893986 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9932-account-create-update-6qlx2" event={"ID":"0c2a0233-04c5-4382-948d-809c1216b075","Type":"ContainerDied","Data":"982580309a618618acf59d7ed62dffc9baa63654e107e79e17f31ae5e09b9d10"} Jan 30 06:38:15 crc kubenswrapper[4931]: I0130 06:38:15.894047 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9932-account-create-update-6qlx2" event={"ID":"0c2a0233-04c5-4382-948d-809c1216b075","Type":"ContainerStarted","Data":"af5a106da150f0bb17662ab7ab3e10a8f540d31cc16fca47df9962bdb4fded43"} Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.283398 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.390750 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk4vh\" (UniqueName: \"kubernetes.io/projected/ede2117e-e3d5-46f6-8a54-1cd987370470-kube-api-access-hk4vh\") pod \"ede2117e-e3d5-46f6-8a54-1cd987370470\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.390814 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ede2117e-e3d5-46f6-8a54-1cd987370470-operator-scripts\") pod \"ede2117e-e3d5-46f6-8a54-1cd987370470\" (UID: \"ede2117e-e3d5-46f6-8a54-1cd987370470\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.391776 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede2117e-e3d5-46f6-8a54-1cd987370470-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ede2117e-e3d5-46f6-8a54-1cd987370470" (UID: 
"ede2117e-e3d5-46f6-8a54-1cd987370470"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.412290 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede2117e-e3d5-46f6-8a54-1cd987370470-kube-api-access-hk4vh" (OuterVolumeSpecName: "kube-api-access-hk4vh") pod "ede2117e-e3d5-46f6-8a54-1cd987370470" (UID: "ede2117e-e3d5-46f6-8a54-1cd987370470"). InnerVolumeSpecName "kube-api-access-hk4vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.492515 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk4vh\" (UniqueName: \"kubernetes.io/projected/ede2117e-e3d5-46f6-8a54-1cd987370470-kube-api-access-hk4vh\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.492558 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ede2117e-e3d5-46f6-8a54-1cd987370470-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.575338 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.581538 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.597444 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.602175 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.622948 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695014 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a06db3-c381-45ef-883d-ee7393822e5a-operator-scripts\") pod \"c7a06db3-c381-45ef-883d-ee7393822e5a\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695066 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c2a0233-04c5-4382-948d-809c1216b075-operator-scripts\") pod \"0c2a0233-04c5-4382-948d-809c1216b075\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695122 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6zhz\" (UniqueName: \"kubernetes.io/projected/730d8243-e8f1-4b7a-b012-d65ff132d427-kube-api-access-l6zhz\") pod \"730d8243-e8f1-4b7a-b012-d65ff132d427\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695158 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/730d8243-e8f1-4b7a-b012-d65ff132d427-operator-scripts\") pod \"730d8243-e8f1-4b7a-b012-d65ff132d427\" (UID: \"730d8243-e8f1-4b7a-b012-d65ff132d427\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695184 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8gzc\" (UniqueName: \"kubernetes.io/projected/0c2a0233-04c5-4382-948d-809c1216b075-kube-api-access-t8gzc\") pod 
\"0c2a0233-04c5-4382-948d-809c1216b075\" (UID: \"0c2a0233-04c5-4382-948d-809c1216b075\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695244 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqz62\" (UniqueName: \"kubernetes.io/projected/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-kube-api-access-rqz62\") pod \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695260 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-operator-scripts\") pod \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\" (UID: \"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695293 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2ac10a-2179-4d51-b7e8-31ac3621d798-operator-scripts\") pod \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695328 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snn26\" (UniqueName: \"kubernetes.io/projected/c7a06db3-c381-45ef-883d-ee7393822e5a-kube-api-access-snn26\") pod \"c7a06db3-c381-45ef-883d-ee7393822e5a\" (UID: \"c7a06db3-c381-45ef-883d-ee7393822e5a\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.695353 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9jrp\" (UniqueName: \"kubernetes.io/projected/0d2ac10a-2179-4d51-b7e8-31ac3621d798-kube-api-access-k9jrp\") pod \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\" (UID: \"0d2ac10a-2179-4d51-b7e8-31ac3621d798\") " Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.696480 4931 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a06db3-c381-45ef-883d-ee7393822e5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7a06db3-c381-45ef-883d-ee7393822e5a" (UID: "c7a06db3-c381-45ef-883d-ee7393822e5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.696886 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d595bdd-ffa6-4292-b4c2-1eba0736a6a4" (UID: "5d595bdd-ffa6-4292-b4c2-1eba0736a6a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.696964 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d2ac10a-2179-4d51-b7e8-31ac3621d798-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d2ac10a-2179-4d51-b7e8-31ac3621d798" (UID: "0d2ac10a-2179-4d51-b7e8-31ac3621d798"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.697729 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/730d8243-e8f1-4b7a-b012-d65ff132d427-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "730d8243-e8f1-4b7a-b012-d65ff132d427" (UID: "730d8243-e8f1-4b7a-b012-d65ff132d427"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.697850 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2a0233-04c5-4382-948d-809c1216b075-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c2a0233-04c5-4382-948d-809c1216b075" (UID: "0c2a0233-04c5-4382-948d-809c1216b075"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.699298 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2a0233-04c5-4382-948d-809c1216b075-kube-api-access-t8gzc" (OuterVolumeSpecName: "kube-api-access-t8gzc") pod "0c2a0233-04c5-4382-948d-809c1216b075" (UID: "0c2a0233-04c5-4382-948d-809c1216b075"). InnerVolumeSpecName "kube-api-access-t8gzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.699494 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-kube-api-access-rqz62" (OuterVolumeSpecName: "kube-api-access-rqz62") pod "5d595bdd-ffa6-4292-b4c2-1eba0736a6a4" (UID: "5d595bdd-ffa6-4292-b4c2-1eba0736a6a4"). InnerVolumeSpecName "kube-api-access-rqz62". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.699637 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/730d8243-e8f1-4b7a-b012-d65ff132d427-kube-api-access-l6zhz" (OuterVolumeSpecName: "kube-api-access-l6zhz") pod "730d8243-e8f1-4b7a-b012-d65ff132d427" (UID: "730d8243-e8f1-4b7a-b012-d65ff132d427"). InnerVolumeSpecName "kube-api-access-l6zhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.699727 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a06db3-c381-45ef-883d-ee7393822e5a-kube-api-access-snn26" (OuterVolumeSpecName: "kube-api-access-snn26") pod "c7a06db3-c381-45ef-883d-ee7393822e5a" (UID: "c7a06db3-c381-45ef-883d-ee7393822e5a"). InnerVolumeSpecName "kube-api-access-snn26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.702139 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2ac10a-2179-4d51-b7e8-31ac3621d798-kube-api-access-k9jrp" (OuterVolumeSpecName: "kube-api-access-k9jrp") pod "0d2ac10a-2179-4d51-b7e8-31ac3621d798" (UID: "0d2ac10a-2179-4d51-b7e8-31ac3621d798"). InnerVolumeSpecName "kube-api-access-k9jrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.796991 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9jrp\" (UniqueName: \"kubernetes.io/projected/0d2ac10a-2179-4d51-b7e8-31ac3621d798-kube-api-access-k9jrp\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797033 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a06db3-c381-45ef-883d-ee7393822e5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797042 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c2a0233-04c5-4382-948d-809c1216b075-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797051 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6zhz\" (UniqueName: 
\"kubernetes.io/projected/730d8243-e8f1-4b7a-b012-d65ff132d427-kube-api-access-l6zhz\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797060 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/730d8243-e8f1-4b7a-b012-d65ff132d427-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797068 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8gzc\" (UniqueName: \"kubernetes.io/projected/0c2a0233-04c5-4382-948d-809c1216b075-kube-api-access-t8gzc\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797077 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqz62\" (UniqueName: \"kubernetes.io/projected/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-kube-api-access-rqz62\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797085 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797093 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2ac10a-2179-4d51-b7e8-31ac3621d798-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.797100 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snn26\" (UniqueName: \"kubernetes.io/projected/c7a06db3-c381-45ef-883d-ee7393822e5a-kube-api-access-snn26\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.922924 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5xpsl" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.923233 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5xpsl" event={"ID":"ede2117e-e3d5-46f6-8a54-1cd987370470","Type":"ContainerDied","Data":"5b80f05329cca2897dfa70c9bdc0e4c4f460e556ac74de0d1d1d646a96c57a30"} Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.923274 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b80f05329cca2897dfa70c9bdc0e4c4f460e556ac74de0d1d1d646a96c57a30" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.924899 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c3ab-account-create-update-6wqgk" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.924908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c3ab-account-create-update-6wqgk" event={"ID":"730d8243-e8f1-4b7a-b012-d65ff132d427","Type":"ContainerDied","Data":"8b48030bead5b8810494ecca940f3ce0fe837fa405d9830629e0021c23f80e05"} Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.924942 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b48030bead5b8810494ecca940f3ce0fe837fa405d9830629e0021c23f80e05" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.926852 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9932-account-create-update-6qlx2" event={"ID":"0c2a0233-04c5-4382-948d-809c1216b075","Type":"ContainerDied","Data":"af5a106da150f0bb17662ab7ab3e10a8f540d31cc16fca47df9962bdb4fded43"} Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.926876 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af5a106da150f0bb17662ab7ab3e10a8f540d31cc16fca47df9962bdb4fded43" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.926883 4931 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-9932-account-create-update-6qlx2" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.928308 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vm2gb" event={"ID":"5d595bdd-ffa6-4292-b4c2-1eba0736a6a4","Type":"ContainerDied","Data":"a98150e0e31af136066d55cd6aac76cfe239ea6044527ea73c361e8d1c5d2a0e"} Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.928343 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98150e0e31af136066d55cd6aac76cfe239ea6044527ea73c361e8d1c5d2a0e" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.928435 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vm2gb" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.929882 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7dkqh" event={"ID":"0d2ac10a-2179-4d51-b7e8-31ac3621d798","Type":"ContainerDied","Data":"e7421e16eac378e7a0dea1f9d24b088f5207c980e51a7f3d6f384ea981d58f88"} Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.929903 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7421e16eac378e7a0dea1f9d24b088f5207c980e51a7f3d6f384ea981d58f88" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.929965 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7dkqh" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.938203 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-184a-account-create-update-b6t5s" event={"ID":"c7a06db3-c381-45ef-883d-ee7393822e5a","Type":"ContainerDied","Data":"be0cce4914a349c03bb254b8aec6996628db2bb168ecf5eee1fd286e7266cb60"} Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.938239 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be0cce4914a349c03bb254b8aec6996628db2bb168ecf5eee1fd286e7266cb60" Jan 30 06:38:17 crc kubenswrapper[4931]: I0130 06:38:17.938328 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-184a-account-create-update-b6t5s" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.153935 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-thknd"] Jan 30 06:38:19 crc kubenswrapper[4931]: E0130 06:38:19.154359 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730d8243-e8f1-4b7a-b012-d65ff132d427" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154376 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="730d8243-e8f1-4b7a-b012-d65ff132d427" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: E0130 06:38:19.154396 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2a0233-04c5-4382-948d-809c1216b075" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154405 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2a0233-04c5-4382-948d-809c1216b075" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: E0130 06:38:19.154419 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5d595bdd-ffa6-4292-b4c2-1eba0736a6a4" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154443 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d595bdd-ffa6-4292-b4c2-1eba0736a6a4" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: E0130 06:38:19.154460 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede2117e-e3d5-46f6-8a54-1cd987370470" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154470 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede2117e-e3d5-46f6-8a54-1cd987370470" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: E0130 06:38:19.154487 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a06db3-c381-45ef-883d-ee7393822e5a" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154496 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a06db3-c381-45ef-883d-ee7393822e5a" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: E0130 06:38:19.154516 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2ac10a-2179-4d51-b7e8-31ac3621d798" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154524 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2ac10a-2179-4d51-b7e8-31ac3621d798" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154720 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede2117e-e3d5-46f6-8a54-1cd987370470" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154742 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="730d8243-e8f1-4b7a-b012-d65ff132d427" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc 
kubenswrapper[4931]: I0130 06:38:19.154753 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2ac10a-2179-4d51-b7e8-31ac3621d798" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154765 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a06db3-c381-45ef-883d-ee7393822e5a" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154777 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2a0233-04c5-4382-948d-809c1216b075" containerName="mariadb-account-create-update" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.154788 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d595bdd-ffa6-4292-b4c2-1eba0736a6a4" containerName="mariadb-database-create" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.155517 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.158260 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.158918 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4nvdz" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.159386 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.177355 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-thknd"] Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.226404 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggbks\" (UniqueName: 
\"kubernetes.io/projected/693a2e91-1503-4caa-a71d-4f65d99a913c-kube-api-access-ggbks\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.226515 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-scripts\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.226544 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-config-data\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.226589 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.328143 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.328569 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ggbks\" (UniqueName: \"kubernetes.io/projected/693a2e91-1503-4caa-a71d-4f65d99a913c-kube-api-access-ggbks\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.328674 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-scripts\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.328712 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-config-data\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.331681 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.332170 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-scripts\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.332489 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-config-data\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.343585 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggbks\" (UniqueName: \"kubernetes.io/projected/693a2e91-1503-4caa-a71d-4f65d99a913c-kube-api-access-ggbks\") pod \"nova-cell0-conductor-db-sync-thknd\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.481452 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.744299 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-thknd"] Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.962536 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-thknd" event={"ID":"693a2e91-1503-4caa-a71d-4f65d99a913c","Type":"ContainerStarted","Data":"0945601f9dc541b7489d17e996b29ddbb60ba07b0e0dec353dfa850db402078c"} Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.962580 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-thknd" event={"ID":"693a2e91-1503-4caa-a71d-4f65d99a913c","Type":"ContainerStarted","Data":"ac62c375bd605110094f1dfe2f9000637195e690488df8000525cc79d4598be2"} Jan 30 06:38:19 crc kubenswrapper[4931]: I0130 06:38:19.981670 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-thknd" podStartSLOduration=0.981650457 podStartE2EDuration="981.650457ms" podCreationTimestamp="2026-01-30 06:38:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:19.974038965 +0000 UTC m=+5435.343949222" watchObservedRunningTime="2026-01-30 06:38:19.981650457 +0000 UTC m=+5435.351560714" Jan 30 06:38:25 crc kubenswrapper[4931]: I0130 06:38:25.028407 4931 generic.go:334] "Generic (PLEG): container finished" podID="693a2e91-1503-4caa-a71d-4f65d99a913c" containerID="0945601f9dc541b7489d17e996b29ddbb60ba07b0e0dec353dfa850db402078c" exitCode=0 Jan 30 06:38:25 crc kubenswrapper[4931]: I0130 06:38:25.028498 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-thknd" event={"ID":"693a2e91-1503-4caa-a71d-4f65d99a913c","Type":"ContainerDied","Data":"0945601f9dc541b7489d17e996b29ddbb60ba07b0e0dec353dfa850db402078c"} Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.503714 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.604693 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-scripts\") pod \"693a2e91-1503-4caa-a71d-4f65d99a913c\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.604812 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-config-data\") pod \"693a2e91-1503-4caa-a71d-4f65d99a913c\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.604952 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-combined-ca-bundle\") pod 
\"693a2e91-1503-4caa-a71d-4f65d99a913c\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.605136 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggbks\" (UniqueName: \"kubernetes.io/projected/693a2e91-1503-4caa-a71d-4f65d99a913c-kube-api-access-ggbks\") pod \"693a2e91-1503-4caa-a71d-4f65d99a913c\" (UID: \"693a2e91-1503-4caa-a71d-4f65d99a913c\") " Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.611981 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693a2e91-1503-4caa-a71d-4f65d99a913c-kube-api-access-ggbks" (OuterVolumeSpecName: "kube-api-access-ggbks") pod "693a2e91-1503-4caa-a71d-4f65d99a913c" (UID: "693a2e91-1503-4caa-a71d-4f65d99a913c"). InnerVolumeSpecName "kube-api-access-ggbks". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.612073 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-scripts" (OuterVolumeSpecName: "scripts") pod "693a2e91-1503-4caa-a71d-4f65d99a913c" (UID: "693a2e91-1503-4caa-a71d-4f65d99a913c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.634685 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "693a2e91-1503-4caa-a71d-4f65d99a913c" (UID: "693a2e91-1503-4caa-a71d-4f65d99a913c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.650900 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-config-data" (OuterVolumeSpecName: "config-data") pod "693a2e91-1503-4caa-a71d-4f65d99a913c" (UID: "693a2e91-1503-4caa-a71d-4f65d99a913c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.707035 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.707075 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.707091 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/693a2e91-1503-4caa-a71d-4f65d99a913c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:26 crc kubenswrapper[4931]: I0130 06:38:26.707149 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggbks\" (UniqueName: \"kubernetes.io/projected/693a2e91-1503-4caa-a71d-4f65d99a913c-kube-api-access-ggbks\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.053225 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-thknd" event={"ID":"693a2e91-1503-4caa-a71d-4f65d99a913c","Type":"ContainerDied","Data":"ac62c375bd605110094f1dfe2f9000637195e690488df8000525cc79d4598be2"} Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.053594 4931 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="ac62c375bd605110094f1dfe2f9000637195e690488df8000525cc79d4598be2" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.053324 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-thknd" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.137842 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:38:27 crc kubenswrapper[4931]: E0130 06:38:27.138240 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693a2e91-1503-4caa-a71d-4f65d99a913c" containerName="nova-cell0-conductor-db-sync" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.138258 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="693a2e91-1503-4caa-a71d-4f65d99a913c" containerName="nova-cell0-conductor-db-sync" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.138483 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="693a2e91-1503-4caa-a71d-4f65d99a913c" containerName="nova-cell0-conductor-db-sync" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.139682 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.142313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.142881 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4nvdz" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.150847 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.216360 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.216566 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.216771 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9bw\" (UniqueName: \"kubernetes.io/projected/2e99598c-cc27-462b-8c5b-9647fdc031dc-kube-api-access-7w9bw\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.318444 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9bw\" (UniqueName: 
\"kubernetes.io/projected/2e99598c-cc27-462b-8c5b-9647fdc031dc-kube-api-access-7w9bw\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.318567 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.320920 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.334009 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.334919 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.337678 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9bw\" (UniqueName: \"kubernetes.io/projected/2e99598c-cc27-462b-8c5b-9647fdc031dc-kube-api-access-7w9bw\") pod \"nova-cell0-conductor-0\" (UID: 
\"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.363763 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.363853 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.457547 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:27 crc kubenswrapper[4931]: I0130 06:38:27.970359 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:38:28 crc kubenswrapper[4931]: I0130 06:38:28.066053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2e99598c-cc27-462b-8c5b-9647fdc031dc","Type":"ContainerStarted","Data":"7484fc206458a3c7c0f0725319e96b32501a736307f2141c6d77f6213f261ff9"} Jan 30 06:38:29 crc kubenswrapper[4931]: I0130 06:38:29.081697 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2e99598c-cc27-462b-8c5b-9647fdc031dc","Type":"ContainerStarted","Data":"a67c2af59ce774fac5d99d16e2c4d0308297f692031a8a81e468f5bef97702ac"} Jan 30 06:38:29 crc kubenswrapper[4931]: I0130 06:38:29.082185 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:29 crc kubenswrapper[4931]: 
I0130 06:38:29.120024 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.119997341 podStartE2EDuration="2.119997341s" podCreationTimestamp="2026-01-30 06:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:29.107480491 +0000 UTC m=+5444.477390828" watchObservedRunningTime="2026-01-30 06:38:29.119997341 +0000 UTC m=+5444.489907638" Jan 30 06:38:37 crc kubenswrapper[4931]: I0130 06:38:37.507084 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.062345 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-49hcs"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.063994 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.069405 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.074833 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-49hcs"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.076163 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.181956 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc 
kubenswrapper[4931]: I0130 06:38:38.182012 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-scripts\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.182130 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-config-data\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.182168 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rmfn\" (UniqueName: \"kubernetes.io/projected/3f484f87-1747-491b-a6c5-dd1d51ff66af-kube-api-access-6rmfn\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.227314 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.229481 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.233638 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.261762 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.283905 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-config-data\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.283954 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rmfn\" (UniqueName: \"kubernetes.io/projected/3f484f87-1747-491b-a6c5-dd1d51ff66af-kube-api-access-6rmfn\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.283988 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-config-data\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.284015 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.284044 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzszf\" (UniqueName: \"kubernetes.io/projected/9399bfc6-7083-4978-b49d-bc46769c2b9e-kube-api-access-gzszf\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.284069 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.284094 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-scripts\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.295143 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-scripts\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.299320 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.299473 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.299685 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-config-data\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.308105 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rmfn\" (UniqueName: \"kubernetes.io/projected/3f484f87-1747-491b-a6c5-dd1d51ff66af-kube-api-access-6rmfn\") pod \"nova-cell0-cell-mapping-49hcs\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.319162 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.340256 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.349062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.363474 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.365195 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.370393 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.375876 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.386985 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-config-data\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.387045 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.387083 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzszf\" (UniqueName: \"kubernetes.io/projected/9399bfc6-7083-4978-b49d-bc46769c2b9e-kube-api-access-gzszf\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.387121 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmssh\" (UniqueName: \"kubernetes.io/projected/dc16d452-7a63-4d86-b729-2f7384b3ea73-kube-api-access-vmssh\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.387236 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.387264 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.388628 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.413865 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-config-data\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.414765 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.426340 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzszf\" (UniqueName: \"kubernetes.io/projected/9399bfc6-7083-4978-b49d-bc46769c2b9e-kube-api-access-gzszf\") pod \"nova-scheduler-0\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc 
kubenswrapper[4931]: I0130 06:38:38.496385 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564639bd-5984-4822-80a8-c88dd5ae22da-logs\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.496489 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmssh\" (UniqueName: \"kubernetes.io/projected/dc16d452-7a63-4d86-b729-2f7384b3ea73-kube-api-access-vmssh\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.497074 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-config-data\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.497143 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42vr8\" (UniqueName: \"kubernetes.io/projected/564639bd-5984-4822-80a8-c88dd5ae22da-kube-api-access-42vr8\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.497181 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.497214 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.497242 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.504597 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.534413 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.536864 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.542969 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.546082 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.558972 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmssh\" (UniqueName: \"kubernetes.io/projected/dc16d452-7a63-4d86-b729-2f7384b3ea73-kube-api-access-vmssh\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.568546 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.575843 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.585936 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67c9d4fb9c-cpnds"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.587978 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.595840 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67c9d4fb9c-cpnds"] Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.598880 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42vr8\" (UniqueName: \"kubernetes.io/projected/564639bd-5984-4822-80a8-c88dd5ae22da-kube-api-access-42vr8\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.598940 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.599036 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564639bd-5984-4822-80a8-c88dd5ae22da-logs\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.599059 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.599115 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-config-data\") pod \"nova-api-0\" (UID: 
\"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.599172 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-config-data\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.599187 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b405de5-9885-473d-acc2-e974d5fcdcdf-logs\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.599212 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qdwb\" (UniqueName: \"kubernetes.io/projected/3b405de5-9885-473d-acc2-e974d5fcdcdf-kube-api-access-5qdwb\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.599833 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564639bd-5984-4822-80a8-c88dd5ae22da-logs\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.602964 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-config-data\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.604001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.616663 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42vr8\" (UniqueName: \"kubernetes.io/projected/564639bd-5984-4822-80a8-c88dd5ae22da-kube-api-access-42vr8\") pod \"nova-metadata-0\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701217 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-config\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701493 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-nb\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701536 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701564 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-sb\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701604 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-config-data\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701671 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b405de5-9885-473d-acc2-e974d5fcdcdf-logs\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701698 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qdwb\" (UniqueName: \"kubernetes.io/projected/3b405de5-9885-473d-acc2-e974d5fcdcdf-kube-api-access-5qdwb\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701720 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-dns-svc\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.701755 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6xc\" (UniqueName: \"kubernetes.io/projected/c8aaa63b-49f3-44c6-abe3-d24692e5894e-kube-api-access-hm6xc\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: 
\"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.702329 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b405de5-9885-473d-acc2-e974d5fcdcdf-logs\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.705110 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.706197 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-config-data\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.719594 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qdwb\" (UniqueName: \"kubernetes.io/projected/3b405de5-9885-473d-acc2-e974d5fcdcdf-kube-api-access-5qdwb\") pod \"nova-api-0\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.803396 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-sb\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.803557 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-dns-svc\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.803608 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm6xc\" (UniqueName: \"kubernetes.io/projected/c8aaa63b-49f3-44c6-abe3-d24692e5894e-kube-api-access-hm6xc\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.803647 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-config\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.803672 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-nb\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.804453 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-sb\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.805995 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-config\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.806221 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-dns-svc\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.810325 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-nb\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.824510 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.829098 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6xc\" (UniqueName: \"kubernetes.io/projected/c8aaa63b-49f3-44c6-abe3-d24692e5894e-kube-api-access-hm6xc\") pod \"dnsmasq-dns-67c9d4fb9c-cpnds\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") " pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.844588 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.873935 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.907778 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:38 crc kubenswrapper[4931]: I0130 06:38:38.992233 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-49hcs"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.076916 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.087715 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xg8js"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.088851 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.093296 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.093372 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.095544 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xg8js"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.193555 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9399bfc6-7083-4978-b49d-bc46769c2b9e","Type":"ContainerStarted","Data":"a4fd2130796fb3af0e7960b4f61e0b3a284a488489630a1d386b0d9487a9d9c8"} Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.194625 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-49hcs" event={"ID":"3f484f87-1747-491b-a6c5-dd1d51ff66af","Type":"ContainerStarted","Data":"96ecf819b31adbd25c929467c1cd8090a82c8e42d995ce015f866e19d37cb78f"} Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.211866 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-config-data\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.211904 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-scripts\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.211985 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzr5v\" (UniqueName: \"kubernetes.io/projected/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-kube-api-access-xzr5v\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.212014 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.292525 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.314317 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzr5v\" (UniqueName: 
\"kubernetes.io/projected/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-kube-api-access-xzr5v\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.314360 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.314479 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-config-data\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.314499 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-scripts\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.317880 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-config-data\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.317999 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-scripts\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.319098 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.330637 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzr5v\" (UniqueName: \"kubernetes.io/projected/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-kube-api-access-xzr5v\") pod \"nova-cell1-conductor-db-sync-xg8js\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.411568 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.419060 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.432065 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.532062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67c9d4fb9c-cpnds"] Jan 30 06:38:39 crc kubenswrapper[4931]: I0130 06:38:39.954137 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xg8js"] Jan 30 06:38:39 crc kubenswrapper[4931]: W0130 06:38:39.960618 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97ed4dbf_5bb0_45b9_bc15_763a93ba7375.slice/crio-d61fb4dbc351b26c251d3469015fc7f9033401d22cd6c8b47548cb37e7ac9418 WatchSource:0}: Error finding container d61fb4dbc351b26c251d3469015fc7f9033401d22cd6c8b47548cb37e7ac9418: Status 404 returned error can't find the container with id d61fb4dbc351b26c251d3469015fc7f9033401d22cd6c8b47548cb37e7ac9418 Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.204573 4931 generic.go:334] "Generic (PLEG): container finished" podID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerID="297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181" exitCode=0 Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.204637 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" event={"ID":"c8aaa63b-49f3-44c6-abe3-d24692e5894e","Type":"ContainerDied","Data":"297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.204662 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" 
event={"ID":"c8aaa63b-49f3-44c6-abe3-d24692e5894e","Type":"ContainerStarted","Data":"be21e08fe5735cb0ef573095e6460329d55a3b26fd373de9ad820520ace903ab"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.207044 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9399bfc6-7083-4978-b49d-bc46769c2b9e","Type":"ContainerStarted","Data":"813a4c2198bcec0f1cab1d8053ab7437050231fbc0dad71c25c447ce377f8fcf"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.209277 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-49hcs" event={"ID":"3f484f87-1747-491b-a6c5-dd1d51ff66af","Type":"ContainerStarted","Data":"f317a267d263377c23363a6996fcb39842543912fc8f152fbbaa9e502e107fac"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.215892 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b405de5-9885-473d-acc2-e974d5fcdcdf","Type":"ContainerStarted","Data":"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.215934 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b405de5-9885-473d-acc2-e974d5fcdcdf","Type":"ContainerStarted","Data":"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.215943 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b405de5-9885-473d-acc2-e974d5fcdcdf","Type":"ContainerStarted","Data":"8e2f7fd7e3b97e352d13a85c9cb339f64f9b7aade8260e679a2c93c61eeeff04"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.234840 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"564639bd-5984-4822-80a8-c88dd5ae22da","Type":"ContainerStarted","Data":"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3"} Jan 30 06:38:40 crc 
kubenswrapper[4931]: I0130 06:38:40.234878 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"564639bd-5984-4822-80a8-c88dd5ae22da","Type":"ContainerStarted","Data":"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.234887 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"564639bd-5984-4822-80a8-c88dd5ae22da","Type":"ContainerStarted","Data":"12fd945e445bdfd59f114b8ea4c531fff45ade762e92da96058d89f075ac3f03"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.241302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc16d452-7a63-4d86-b729-2f7384b3ea73","Type":"ContainerStarted","Data":"781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.241340 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc16d452-7a63-4d86-b729-2f7384b3ea73","Type":"ContainerStarted","Data":"be2cb5795144a0336c26c2ce840d01e8f6b40f1134f0aca0ca6716edd8f9b6e4"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.245671 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.245651963 podStartE2EDuration="2.245651963s" podCreationTimestamp="2026-01-30 06:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:40.244166391 +0000 UTC m=+5455.614076648" watchObservedRunningTime="2026-01-30 06:38:40.245651963 +0000 UTC m=+5455.615562230" Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.248757 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xg8js" 
event={"ID":"97ed4dbf-5bb0-45b9-bc15-763a93ba7375","Type":"ContainerStarted","Data":"4fbc33cf72a98eaa43152775b96af49bda5487d618cfc4a531333efaa8d2b27d"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.248811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xg8js" event={"ID":"97ed4dbf-5bb0-45b9-bc15-763a93ba7375","Type":"ContainerStarted","Data":"d61fb4dbc351b26c251d3469015fc7f9033401d22cd6c8b47548cb37e7ac9418"} Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.294111 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.294086148 podStartE2EDuration="2.294086148s" podCreationTimestamp="2026-01-30 06:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:40.283055189 +0000 UTC m=+5455.652965446" watchObservedRunningTime="2026-01-30 06:38:40.294086148 +0000 UTC m=+5455.663996405" Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.323777 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-49hcs" podStartSLOduration=2.323752118 podStartE2EDuration="2.323752118s" podCreationTimestamp="2026-01-30 06:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:40.304691654 +0000 UTC m=+5455.674601901" watchObservedRunningTime="2026-01-30 06:38:40.323752118 +0000 UTC m=+5455.693662375" Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.331101 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xg8js" podStartSLOduration=1.331076422 podStartE2EDuration="1.331076422s" podCreationTimestamp="2026-01-30 06:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-30 06:38:40.319517339 +0000 UTC m=+5455.689427596" watchObservedRunningTime="2026-01-30 06:38:40.331076422 +0000 UTC m=+5455.700986679" Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.394805 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.394786835 podStartE2EDuration="2.394786835s" podCreationTimestamp="2026-01-30 06:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:40.35923097 +0000 UTC m=+5455.729141247" watchObservedRunningTime="2026-01-30 06:38:40.394786835 +0000 UTC m=+5455.764697092" Jan 30 06:38:40 crc kubenswrapper[4931]: I0130 06:38:40.399691 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.399682912 podStartE2EDuration="2.399682912s" podCreationTimestamp="2026-01-30 06:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:40.344189069 +0000 UTC m=+5455.714099326" watchObservedRunningTime="2026-01-30 06:38:40.399682912 +0000 UTC m=+5455.769593169" Jan 30 06:38:41 crc kubenswrapper[4931]: I0130 06:38:41.267706 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" event={"ID":"c8aaa63b-49f3-44c6-abe3-d24692e5894e","Type":"ContainerStarted","Data":"d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69"} Jan 30 06:38:41 crc kubenswrapper[4931]: I0130 06:38:41.312933 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" podStartSLOduration=3.31289401 podStartE2EDuration="3.31289401s" podCreationTimestamp="2026-01-30 06:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:41.289254469 +0000 UTC m=+5456.659164726" watchObservedRunningTime="2026-01-30 06:38:41.31289401 +0000 UTC m=+5456.682804297" Jan 30 06:38:42 crc kubenswrapper[4931]: I0130 06:38:42.274360 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:43 crc kubenswrapper[4931]: I0130 06:38:43.291671 4931 generic.go:334] "Generic (PLEG): container finished" podID="97ed4dbf-5bb0-45b9-bc15-763a93ba7375" containerID="4fbc33cf72a98eaa43152775b96af49bda5487d618cfc4a531333efaa8d2b27d" exitCode=0 Jan 30 06:38:43 crc kubenswrapper[4931]: I0130 06:38:43.291774 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xg8js" event={"ID":"97ed4dbf-5bb0-45b9-bc15-763a93ba7375","Type":"ContainerDied","Data":"4fbc33cf72a98eaa43152775b96af49bda5487d618cfc4a531333efaa8d2b27d"} Jan 30 06:38:43 crc kubenswrapper[4931]: I0130 06:38:43.569368 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 06:38:43 crc kubenswrapper[4931]: I0130 06:38:43.824742 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:43 crc kubenswrapper[4931]: I0130 06:38:43.845351 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:38:43 crc kubenswrapper[4931]: I0130 06:38:43.845451 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.305838 4931 generic.go:334] "Generic (PLEG): container finished" podID="3f484f87-1747-491b-a6c5-dd1d51ff66af" containerID="f317a267d263377c23363a6996fcb39842543912fc8f152fbbaa9e502e107fac" exitCode=0 Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.306199 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-49hcs" event={"ID":"3f484f87-1747-491b-a6c5-dd1d51ff66af","Type":"ContainerDied","Data":"f317a267d263377c23363a6996fcb39842543912fc8f152fbbaa9e502e107fac"} Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.708496 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.868235 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzr5v\" (UniqueName: \"kubernetes.io/projected/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-kube-api-access-xzr5v\") pod \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.868294 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-config-data\") pod \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.868555 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-combined-ca-bundle\") pod \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.868607 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-scripts\") pod \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\" (UID: \"97ed4dbf-5bb0-45b9-bc15-763a93ba7375\") " Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.875079 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-kube-api-access-xzr5v" (OuterVolumeSpecName: "kube-api-access-xzr5v") pod "97ed4dbf-5bb0-45b9-bc15-763a93ba7375" (UID: "97ed4dbf-5bb0-45b9-bc15-763a93ba7375"). InnerVolumeSpecName "kube-api-access-xzr5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.875910 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-scripts" (OuterVolumeSpecName: "scripts") pod "97ed4dbf-5bb0-45b9-bc15-763a93ba7375" (UID: "97ed4dbf-5bb0-45b9-bc15-763a93ba7375"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.895311 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97ed4dbf-5bb0-45b9-bc15-763a93ba7375" (UID: "97ed4dbf-5bb0-45b9-bc15-763a93ba7375"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.921022 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-config-data" (OuterVolumeSpecName: "config-data") pod "97ed4dbf-5bb0-45b9-bc15-763a93ba7375" (UID: "97ed4dbf-5bb0-45b9-bc15-763a93ba7375"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.971651 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzr5v\" (UniqueName: \"kubernetes.io/projected/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-kube-api-access-xzr5v\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.971712 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.971734 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:44 crc kubenswrapper[4931]: I0130 06:38:44.971753 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ed4dbf-5bb0-45b9-bc15-763a93ba7375-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.321316 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xg8js" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.321334 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xg8js" event={"ID":"97ed4dbf-5bb0-45b9-bc15-763a93ba7375","Type":"ContainerDied","Data":"d61fb4dbc351b26c251d3469015fc7f9033401d22cd6c8b47548cb37e7ac9418"} Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.321405 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d61fb4dbc351b26c251d3469015fc7f9033401d22cd6c8b47548cb37e7ac9418" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.985478 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:38:45 crc kubenswrapper[4931]: E0130 06:38:45.986162 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ed4dbf-5bb0-45b9-bc15-763a93ba7375" containerName="nova-cell1-conductor-db-sync" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.986179 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ed4dbf-5bb0-45b9-bc15-763a93ba7375" containerName="nova-cell1-conductor-db-sync" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.986398 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ed4dbf-5bb0-45b9-bc15-763a93ba7375" containerName="nova-cell1-conductor-db-sync" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.987091 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.989490 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 06:38:45 crc kubenswrapper[4931]: I0130 06:38:45.997102 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.093077 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.093215 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt4r9\" (UniqueName: \"kubernetes.io/projected/c9b04495-2e29-4188-adbe-e6ed3669c25a-kube-api-access-tt4r9\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.093508 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.200589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc 
kubenswrapper[4931]: I0130 06:38:46.200698 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.200743 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt4r9\" (UniqueName: \"kubernetes.io/projected/c9b04495-2e29-4188-adbe-e6ed3669c25a-kube-api-access-tt4r9\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.207482 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.208862 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.214569 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.224728 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt4r9\" (UniqueName: \"kubernetes.io/projected/c9b04495-2e29-4188-adbe-e6ed3669c25a-kube-api-access-tt4r9\") pod \"nova-cell1-conductor-0\" (UID: 
\"c9b04495-2e29-4188-adbe-e6ed3669c25a\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.294069 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.314896 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.383624 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-49hcs" event={"ID":"3f484f87-1747-491b-a6c5-dd1d51ff66af","Type":"ContainerDied","Data":"96ecf819b31adbd25c929467c1cd8090a82c8e42d995ce015f866e19d37cb78f"} Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.383678 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49hcs" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.383680 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ecf819b31adbd25c929467c1cd8090a82c8e42d995ce015f866e19d37cb78f" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.403574 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-combined-ca-bundle\") pod \"3f484f87-1747-491b-a6c5-dd1d51ff66af\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.403784 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmfn\" (UniqueName: \"kubernetes.io/projected/3f484f87-1747-491b-a6c5-dd1d51ff66af-kube-api-access-6rmfn\") pod \"3f484f87-1747-491b-a6c5-dd1d51ff66af\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.403927 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-config-data\") pod \"3f484f87-1747-491b-a6c5-dd1d51ff66af\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.404011 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-scripts\") pod \"3f484f87-1747-491b-a6c5-dd1d51ff66af\" (UID: \"3f484f87-1747-491b-a6c5-dd1d51ff66af\") " Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.412038 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f484f87-1747-491b-a6c5-dd1d51ff66af-kube-api-access-6rmfn" (OuterVolumeSpecName: "kube-api-access-6rmfn") pod "3f484f87-1747-491b-a6c5-dd1d51ff66af" (UID: "3f484f87-1747-491b-a6c5-dd1d51ff66af"). InnerVolumeSpecName "kube-api-access-6rmfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.413672 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-scripts" (OuterVolumeSpecName: "scripts") pod "3f484f87-1747-491b-a6c5-dd1d51ff66af" (UID: "3f484f87-1747-491b-a6c5-dd1d51ff66af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.426239 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f484f87-1747-491b-a6c5-dd1d51ff66af" (UID: "3f484f87-1747-491b-a6c5-dd1d51ff66af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.439056 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-config-data" (OuterVolumeSpecName: "config-data") pod "3f484f87-1747-491b-a6c5-dd1d51ff66af" (UID: "3f484f87-1747-491b-a6c5-dd1d51ff66af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.518833 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rmfn\" (UniqueName: \"kubernetes.io/projected/3f484f87-1747-491b-a6c5-dd1d51ff66af-kube-api-access-6rmfn\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.518879 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.518891 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.518907 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f484f87-1747-491b-a6c5-dd1d51ff66af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.558855 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.559152 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-log" 
containerID="cri-o://02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2" gracePeriod=30 Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.559456 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-api" containerID="cri-o://0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794" gracePeriod=30 Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.566307 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.566524 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9399bfc6-7083-4978-b49d-bc46769c2b9e" containerName="nova-scheduler-scheduler" containerID="cri-o://813a4c2198bcec0f1cab1d8053ab7437050231fbc0dad71c25c447ce377f8fcf" gracePeriod=30 Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.590625 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.590839 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-log" containerID="cri-o://cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95" gracePeriod=30 Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.591676 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-metadata" containerID="cri-o://1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3" gracePeriod=30 Jan 30 06:38:46 crc kubenswrapper[4931]: I0130 06:38:46.851264 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:38:47 crc 
kubenswrapper[4931]: I0130 06:38:47.068481 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.231033 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-combined-ca-bundle\") pod \"3b405de5-9885-473d-acc2-e974d5fcdcdf\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.231372 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qdwb\" (UniqueName: \"kubernetes.io/projected/3b405de5-9885-473d-acc2-e974d5fcdcdf-kube-api-access-5qdwb\") pod \"3b405de5-9885-473d-acc2-e974d5fcdcdf\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.231441 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-config-data\") pod \"3b405de5-9885-473d-acc2-e974d5fcdcdf\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.231640 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b405de5-9885-473d-acc2-e974d5fcdcdf-logs\") pod \"3b405de5-9885-473d-acc2-e974d5fcdcdf\" (UID: \"3b405de5-9885-473d-acc2-e974d5fcdcdf\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.232323 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b405de5-9885-473d-acc2-e974d5fcdcdf-logs" (OuterVolumeSpecName: "logs") pod "3b405de5-9885-473d-acc2-e974d5fcdcdf" (UID: "3b405de5-9885-473d-acc2-e974d5fcdcdf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.236514 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b405de5-9885-473d-acc2-e974d5fcdcdf-kube-api-access-5qdwb" (OuterVolumeSpecName: "kube-api-access-5qdwb") pod "3b405de5-9885-473d-acc2-e974d5fcdcdf" (UID: "3b405de5-9885-473d-acc2-e974d5fcdcdf"). InnerVolumeSpecName "kube-api-access-5qdwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.237676 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.257077 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b405de5-9885-473d-acc2-e974d5fcdcdf" (UID: "3b405de5-9885-473d-acc2-e974d5fcdcdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.273492 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-config-data" (OuterVolumeSpecName: "config-data") pod "3b405de5-9885-473d-acc2-e974d5fcdcdf" (UID: "3b405de5-9885-473d-acc2-e974d5fcdcdf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.333331 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42vr8\" (UniqueName: \"kubernetes.io/projected/564639bd-5984-4822-80a8-c88dd5ae22da-kube-api-access-42vr8\") pod \"564639bd-5984-4822-80a8-c88dd5ae22da\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.333413 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-config-data\") pod \"564639bd-5984-4822-80a8-c88dd5ae22da\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.333472 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-combined-ca-bundle\") pod \"564639bd-5984-4822-80a8-c88dd5ae22da\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.333572 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564639bd-5984-4822-80a8-c88dd5ae22da-logs\") pod \"564639bd-5984-4822-80a8-c88dd5ae22da\" (UID: \"564639bd-5984-4822-80a8-c88dd5ae22da\") " Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.334203 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b405de5-9885-473d-acc2-e974d5fcdcdf-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.334253 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc 
kubenswrapper[4931]: I0130 06:38:47.334269 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qdwb\" (UniqueName: \"kubernetes.io/projected/3b405de5-9885-473d-acc2-e974d5fcdcdf-kube-api-access-5qdwb\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.334281 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b405de5-9885-473d-acc2-e974d5fcdcdf-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.334985 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564639bd-5984-4822-80a8-c88dd5ae22da-logs" (OuterVolumeSpecName: "logs") pod "564639bd-5984-4822-80a8-c88dd5ae22da" (UID: "564639bd-5984-4822-80a8-c88dd5ae22da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.338809 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564639bd-5984-4822-80a8-c88dd5ae22da-kube-api-access-42vr8" (OuterVolumeSpecName: "kube-api-access-42vr8") pod "564639bd-5984-4822-80a8-c88dd5ae22da" (UID: "564639bd-5984-4822-80a8-c88dd5ae22da"). InnerVolumeSpecName "kube-api-access-42vr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.358107 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "564639bd-5984-4822-80a8-c88dd5ae22da" (UID: "564639bd-5984-4822-80a8-c88dd5ae22da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.378455 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-config-data" (OuterVolumeSpecName: "config-data") pod "564639bd-5984-4822-80a8-c88dd5ae22da" (UID: "564639bd-5984-4822-80a8-c88dd5ae22da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.392102 4931 generic.go:334] "Generic (PLEG): container finished" podID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerID="0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794" exitCode=0 Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.393286 4931 generic.go:334] "Generic (PLEG): container finished" podID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerID="02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2" exitCode=143 Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.392354 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.392264 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b405de5-9885-473d-acc2-e974d5fcdcdf","Type":"ContainerDied","Data":"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.394053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b405de5-9885-473d-acc2-e974d5fcdcdf","Type":"ContainerDied","Data":"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.394076 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b405de5-9885-473d-acc2-e974d5fcdcdf","Type":"ContainerDied","Data":"8e2f7fd7e3b97e352d13a85c9cb339f64f9b7aade8260e679a2c93c61eeeff04"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.394097 4931 scope.go:117] "RemoveContainer" containerID="0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.396768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c9b04495-2e29-4188-adbe-e6ed3669c25a","Type":"ContainerStarted","Data":"7956c67be4873a213d6ce531a234902ab8e420fef8d81bd2f9cc50a55b2ed19e"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.396803 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c9b04495-2e29-4188-adbe-e6ed3669c25a","Type":"ContainerStarted","Data":"c53e952c29d0f7bb7753df2ecd373b270a2a034437bebd33a1a8707e3ab33ea8"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.397199 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.404235 4931 generic.go:334] "Generic 
(PLEG): container finished" podID="564639bd-5984-4822-80a8-c88dd5ae22da" containerID="1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3" exitCode=0 Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.404453 4931 generic.go:334] "Generic (PLEG): container finished" podID="564639bd-5984-4822-80a8-c88dd5ae22da" containerID="cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95" exitCode=143 Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.404669 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"564639bd-5984-4822-80a8-c88dd5ae22da","Type":"ContainerDied","Data":"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.404828 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"564639bd-5984-4822-80a8-c88dd5ae22da","Type":"ContainerDied","Data":"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.404965 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"564639bd-5984-4822-80a8-c88dd5ae22da","Type":"ContainerDied","Data":"12fd945e445bdfd59f114b8ea4c531fff45ade762e92da96058d89f075ac3f03"} Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.405359 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.428512 4931 scope.go:117] "RemoveContainer" containerID="02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.443152 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42vr8\" (UniqueName: \"kubernetes.io/projected/564639bd-5984-4822-80a8-c88dd5ae22da-kube-api-access-42vr8\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.443505 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.443593 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564639bd-5984-4822-80a8-c88dd5ae22da-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.443693 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564639bd-5984-4822-80a8-c88dd5ae22da-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.450639 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.450608549 podStartE2EDuration="2.450608549s" podCreationTimestamp="2026-01-30 06:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:47.426947357 +0000 UTC m=+5462.796857614" watchObservedRunningTime="2026-01-30 06:38:47.450608549 +0000 UTC m=+5462.820518846" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.473398 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.516064 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.519666 4931 scope.go:117] "RemoveContainer" containerID="0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794" Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.520632 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794\": container with ID starting with 0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794 not found: ID does not exist" containerID="0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.520664 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794"} err="failed to get container status \"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794\": rpc error: code = NotFound desc = could not find container \"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794\": container with ID starting with 0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.520685 4931 scope.go:117] "RemoveContainer" containerID="02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2" Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.520907 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2\": container with ID starting with 02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2 not found: ID does not exist" 
containerID="02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.521579 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2"} err="failed to get container status \"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2\": rpc error: code = NotFound desc = could not find container \"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2\": container with ID starting with 02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.521608 4931 scope.go:117] "RemoveContainer" containerID="0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.521924 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794"} err="failed to get container status \"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794\": rpc error: code = NotFound desc = could not find container \"0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794\": container with ID starting with 0f837ad88ec9472bf685a2a4e4bca226406e5e671841748e97cd2f939ad6d794 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.521979 4931 scope.go:117] "RemoveContainer" containerID="02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.522397 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2"} err="failed to get container status \"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2\": rpc error: code = NotFound desc = could 
not find container \"02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2\": container with ID starting with 02ba2014b1566d696849ad8dcc9456620c92cebec9969725ae530ca735245ce2 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.522439 4931 scope.go:117] "RemoveContainer" containerID="1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.524777 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.525170 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-log" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525181 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-log" Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.525200 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-api" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525206 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-api" Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.525215 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-metadata" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525221 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-metadata" Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.525234 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-log" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 
06:38:47.525239 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-log" Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.525254 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f484f87-1747-491b-a6c5-dd1d51ff66af" containerName="nova-manage" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525259 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f484f87-1747-491b-a6c5-dd1d51ff66af" containerName="nova-manage" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525402 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-log" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525429 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f484f87-1747-491b-a6c5-dd1d51ff66af" containerName="nova-manage" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525443 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" containerName="nova-api-api" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525454 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-metadata" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.525465 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" containerName="nova-metadata-log" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.526715 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.532672 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.538530 4931 scope.go:117] "RemoveContainer" containerID="cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.544698 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.552439 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.559515 4931 scope.go:117] "RemoveContainer" containerID="1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.560490 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.562753 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3\": container with ID starting with 1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3 not found: ID does not exist" containerID="1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.562788 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3"} err="failed to get container status \"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3\": rpc error: code = NotFound desc = could not find container \"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3\": container with ID starting with 
1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.562817 4931 scope.go:117] "RemoveContainer" containerID="cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95" Jan 30 06:38:47 crc kubenswrapper[4931]: E0130 06:38:47.563149 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95\": container with ID starting with cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95 not found: ID does not exist" containerID="cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.563186 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95"} err="failed to get container status \"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95\": rpc error: code = NotFound desc = could not find container \"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95\": container with ID starting with cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.563211 4931 scope.go:117] "RemoveContainer" containerID="1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.563567 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3"} err="failed to get container status \"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3\": rpc error: code = NotFound desc = could not find container \"1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3\": container with ID 
starting with 1df264c72c48138559ebff03be7bd1d563bc44130837ce7ac5ce396f96609ca3 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.563585 4931 scope.go:117] "RemoveContainer" containerID="cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.563829 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95"} err="failed to get container status \"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95\": rpc error: code = NotFound desc = could not find container \"cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95\": container with ID starting with cae1c4faeb2daaa7074a7b00fa2d92090fd657c7c771ec3ca9a114e70a76bb95 not found: ID does not exist" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.568242 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.569684 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.571732 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.576676 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.647977 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.648093 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9psj\" (UniqueName: \"kubernetes.io/projected/dd921adf-3fc0-4727-bf34-17203123e432-kube-api-access-z9psj\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.648271 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpc9m\" (UniqueName: \"kubernetes.io/projected/7144d1e4-cf7f-4cd9-891c-02bf466f894f-kube-api-access-mpc9m\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.648327 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.648362 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-config-data\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.648480 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-config-data\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.648536 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd921adf-3fc0-4727-bf34-17203123e432-logs\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.648573 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7144d1e4-cf7f-4cd9-891c-02bf466f894f-logs\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.749753 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-config-data\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.749813 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd921adf-3fc0-4727-bf34-17203123e432-logs\") pod \"nova-metadata-0\" (UID: 
\"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.749838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7144d1e4-cf7f-4cd9-891c-02bf466f894f-logs\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.749892 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.749924 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9psj\" (UniqueName: \"kubernetes.io/projected/dd921adf-3fc0-4727-bf34-17203123e432-kube-api-access-z9psj\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.750010 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpc9m\" (UniqueName: \"kubernetes.io/projected/7144d1e4-cf7f-4cd9-891c-02bf466f894f-kube-api-access-mpc9m\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.750032 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.750049 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-config-data\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.751588 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd921adf-3fc0-4727-bf34-17203123e432-logs\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.754884 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-config-data\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.768317 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7144d1e4-cf7f-4cd9-891c-02bf466f894f-logs\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.769049 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.769303 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-config-data\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc 
kubenswrapper[4931]: I0130 06:38:47.769727 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.772349 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9psj\" (UniqueName: \"kubernetes.io/projected/dd921adf-3fc0-4727-bf34-17203123e432-kube-api-access-z9psj\") pod \"nova-metadata-0\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.773001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpc9m\" (UniqueName: \"kubernetes.io/projected/7144d1e4-cf7f-4cd9-891c-02bf466f894f-kube-api-access-mpc9m\") pod \"nova-api-0\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " pod="openstack/nova-api-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.848910 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:38:47 crc kubenswrapper[4931]: I0130 06:38:47.884112 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.271581 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:38:48 crc kubenswrapper[4931]: W0130 06:38:48.276978 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7144d1e4_cf7f_4cd9_891c_02bf466f894f.slice/crio-d8da632fe1a21ba627e5a93e9f2262e371b5481c6d1d50e41210e4e09d1d78c5 WatchSource:0}: Error finding container d8da632fe1a21ba627e5a93e9f2262e371b5481c6d1d50e41210e4e09d1d78c5: Status 404 returned error can't find the container with id d8da632fe1a21ba627e5a93e9f2262e371b5481c6d1d50e41210e4e09d1d78c5 Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.360507 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:38:48 crc kubenswrapper[4931]: W0130 06:38:48.375297 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd921adf_3fc0_4727_bf34_17203123e432.slice/crio-3427a754aa9dbc9c068a50beefaa44100c472db7b66af28a59aa83acea56ca82 WatchSource:0}: Error finding container 3427a754aa9dbc9c068a50beefaa44100c472db7b66af28a59aa83acea56ca82: Status 404 returned error can't find the container with id 3427a754aa9dbc9c068a50beefaa44100c472db7b66af28a59aa83acea56ca82 Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.419524 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7144d1e4-cf7f-4cd9-891c-02bf466f894f","Type":"ContainerStarted","Data":"d8da632fe1a21ba627e5a93e9f2262e371b5481c6d1d50e41210e4e09d1d78c5"} Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.421673 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"dd921adf-3fc0-4727-bf34-17203123e432","Type":"ContainerStarted","Data":"3427a754aa9dbc9c068a50beefaa44100c472db7b66af28a59aa83acea56ca82"} Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.825264 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.839102 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.909695 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.972885 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf9d65499-j99dc"] Jan 30 06:38:48 crc kubenswrapper[4931]: I0130 06:38:48.973109 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" podUID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerName="dnsmasq-dns" containerID="cri-o://2e8dbb3daa04d0b7ca86823625c46d0d78b12efba028999fcdcceb4252172d0f" gracePeriod=10 Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.436231 4931 generic.go:334] "Generic (PLEG): container finished" podID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerID="2e8dbb3daa04d0b7ca86823625c46d0d78b12efba028999fcdcceb4252172d0f" exitCode=0 Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.442542 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b405de5-9885-473d-acc2-e974d5fcdcdf" path="/var/lib/kubelet/pods/3b405de5-9885-473d-acc2-e974d5fcdcdf/volumes" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.443168 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564639bd-5984-4822-80a8-c88dd5ae22da" path="/var/lib/kubelet/pods/564639bd-5984-4822-80a8-c88dd5ae22da/volumes" Jan 30 06:38:49 crc 
kubenswrapper[4931]: I0130 06:38:49.443787 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" event={"ID":"94c3e877-4729-4777-9460-2fdce31b2bc3","Type":"ContainerDied","Data":"2e8dbb3daa04d0b7ca86823625c46d0d78b12efba028999fcdcceb4252172d0f"} Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.466557 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7144d1e4-cf7f-4cd9-891c-02bf466f894f","Type":"ContainerStarted","Data":"99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed"} Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.466600 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7144d1e4-cf7f-4cd9-891c-02bf466f894f","Type":"ContainerStarted","Data":"587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8"} Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.470301 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd921adf-3fc0-4727-bf34-17203123e432","Type":"ContainerStarted","Data":"c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5"} Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.470333 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd921adf-3fc0-4727-bf34-17203123e432","Type":"ContainerStarted","Data":"93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb"} Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.487721 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.507653 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5076304350000003 podStartE2EDuration="2.507630435s" podCreationTimestamp="2026-01-30 06:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:49.491464883 +0000 UTC m=+5464.861375160" watchObservedRunningTime="2026-01-30 06:38:49.507630435 +0000 UTC m=+5464.877540692" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.525768 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.538555 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.53853694 podStartE2EDuration="2.53853694s" podCreationTimestamp="2026-01-30 06:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:49.518749017 +0000 UTC m=+5464.888659274" watchObservedRunningTime="2026-01-30 06:38:49.53853694 +0000 UTC m=+5464.908447197" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.705092 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-sb\") pod \"94c3e877-4729-4777-9460-2fdce31b2bc3\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.705144 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrwmf\" (UniqueName: \"kubernetes.io/projected/94c3e877-4729-4777-9460-2fdce31b2bc3-kube-api-access-qrwmf\") pod \"94c3e877-4729-4777-9460-2fdce31b2bc3\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.705165 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-dns-svc\") pod \"94c3e877-4729-4777-9460-2fdce31b2bc3\" (UID: 
\"94c3e877-4729-4777-9460-2fdce31b2bc3\") " Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.705367 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-config\") pod \"94c3e877-4729-4777-9460-2fdce31b2bc3\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.705873 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-nb\") pod \"94c3e877-4729-4777-9460-2fdce31b2bc3\" (UID: \"94c3e877-4729-4777-9460-2fdce31b2bc3\") " Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.714188 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c3e877-4729-4777-9460-2fdce31b2bc3-kube-api-access-qrwmf" (OuterVolumeSpecName: "kube-api-access-qrwmf") pod "94c3e877-4729-4777-9460-2fdce31b2bc3" (UID: "94c3e877-4729-4777-9460-2fdce31b2bc3"). InnerVolumeSpecName "kube-api-access-qrwmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.747588 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "94c3e877-4729-4777-9460-2fdce31b2bc3" (UID: "94c3e877-4729-4777-9460-2fdce31b2bc3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.748406 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94c3e877-4729-4777-9460-2fdce31b2bc3" (UID: "94c3e877-4729-4777-9460-2fdce31b2bc3"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.748818 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "94c3e877-4729-4777-9460-2fdce31b2bc3" (UID: "94c3e877-4729-4777-9460-2fdce31b2bc3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.762382 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-config" (OuterVolumeSpecName: "config") pod "94c3e877-4729-4777-9460-2fdce31b2bc3" (UID: "94c3e877-4729-4777-9460-2fdce31b2bc3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.808138 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.808167 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.808179 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.808189 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrwmf\" (UniqueName: \"kubernetes.io/projected/94c3e877-4729-4777-9460-2fdce31b2bc3-kube-api-access-qrwmf\") on node \"crc\" 
DevicePath \"\"" Jan 30 06:38:49 crc kubenswrapper[4931]: I0130 06:38:49.808199 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c3e877-4729-4777-9460-2fdce31b2bc3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.481338 4931 generic.go:334] "Generic (PLEG): container finished" podID="9399bfc6-7083-4978-b49d-bc46769c2b9e" containerID="813a4c2198bcec0f1cab1d8053ab7437050231fbc0dad71c25c447ce377f8fcf" exitCode=0 Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.481469 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9399bfc6-7083-4978-b49d-bc46769c2b9e","Type":"ContainerDied","Data":"813a4c2198bcec0f1cab1d8053ab7437050231fbc0dad71c25c447ce377f8fcf"} Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.484182 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" event={"ID":"94c3e877-4729-4777-9460-2fdce31b2bc3","Type":"ContainerDied","Data":"cbfdef9dfaed32413bc6b3a12689ef221574794a4832dba0bff0f3e04d140623"} Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.484261 4931 scope.go:117] "RemoveContainer" containerID="2e8dbb3daa04d0b7ca86823625c46d0d78b12efba028999fcdcceb4252172d0f" Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.484491 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf9d65499-j99dc" Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.527511 4931 scope.go:117] "RemoveContainer" containerID="d3534b4266cad4f8c042a5d1a723852bc5684f3d2cae49aae0ea01f2e1276ee4" Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.529579 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf9d65499-j99dc"] Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.545014 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf9d65499-j99dc"] Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.762987 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.938974 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-combined-ca-bundle\") pod \"9399bfc6-7083-4978-b49d-bc46769c2b9e\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.939566 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-config-data\") pod \"9399bfc6-7083-4978-b49d-bc46769c2b9e\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.939693 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzszf\" (UniqueName: \"kubernetes.io/projected/9399bfc6-7083-4978-b49d-bc46769c2b9e-kube-api-access-gzszf\") pod \"9399bfc6-7083-4978-b49d-bc46769c2b9e\" (UID: \"9399bfc6-7083-4978-b49d-bc46769c2b9e\") " Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.949986 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9399bfc6-7083-4978-b49d-bc46769c2b9e-kube-api-access-gzszf" (OuterVolumeSpecName: "kube-api-access-gzszf") pod "9399bfc6-7083-4978-b49d-bc46769c2b9e" (UID: "9399bfc6-7083-4978-b49d-bc46769c2b9e"). InnerVolumeSpecName "kube-api-access-gzszf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.987631 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9399bfc6-7083-4978-b49d-bc46769c2b9e" (UID: "9399bfc6-7083-4978-b49d-bc46769c2b9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:50 crc kubenswrapper[4931]: I0130 06:38:50.988364 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-config-data" (OuterVolumeSpecName: "config-data") pod "9399bfc6-7083-4978-b49d-bc46769c2b9e" (UID: "9399bfc6-7083-4978-b49d-bc46769c2b9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.044081 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.044157 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzszf\" (UniqueName: \"kubernetes.io/projected/9399bfc6-7083-4978-b49d-bc46769c2b9e-kube-api-access-gzszf\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.044187 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9399bfc6-7083-4978-b49d-bc46769c2b9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.441564 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c3e877-4729-4777-9460-2fdce31b2bc3" path="/var/lib/kubelet/pods/94c3e877-4729-4777-9460-2fdce31b2bc3/volumes" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.520618 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9399bfc6-7083-4978-b49d-bc46769c2b9e","Type":"ContainerDied","Data":"a4fd2130796fb3af0e7960b4f61e0b3a284a488489630a1d386b0d9487a9d9c8"} Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.520699 4931 scope.go:117] "RemoveContainer" containerID="813a4c2198bcec0f1cab1d8053ab7437050231fbc0dad71c25c447ce377f8fcf" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.520745 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.582880 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.612234 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.625295 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:51 crc kubenswrapper[4931]: E0130 06:38:51.625800 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerName="dnsmasq-dns" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.625824 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerName="dnsmasq-dns" Jan 30 06:38:51 crc kubenswrapper[4931]: E0130 06:38:51.625844 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerName="init" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.625853 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerName="init" Jan 30 06:38:51 crc kubenswrapper[4931]: E0130 06:38:51.625889 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9399bfc6-7083-4978-b49d-bc46769c2b9e" containerName="nova-scheduler-scheduler" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.625899 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9399bfc6-7083-4978-b49d-bc46769c2b9e" containerName="nova-scheduler-scheduler" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.626117 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9399bfc6-7083-4978-b49d-bc46769c2b9e" containerName="nova-scheduler-scheduler" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.626149 4931 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="94c3e877-4729-4777-9460-2fdce31b2bc3" containerName="dnsmasq-dns" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.626915 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.629599 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.636166 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.661400 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-config-data\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.661568 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.661658 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khfr\" (UniqueName: \"kubernetes.io/projected/ee55169a-5fa4-4ad5-b765-41685339650c-kube-api-access-6khfr\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.763647 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.763764 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6khfr\" (UniqueName: \"kubernetes.io/projected/ee55169a-5fa4-4ad5-b765-41685339650c-kube-api-access-6khfr\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.763811 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-config-data\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.767539 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.769402 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-config-data\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.785504 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khfr\" (UniqueName: \"kubernetes.io/projected/ee55169a-5fa4-4ad5-b765-41685339650c-kube-api-access-6khfr\") pod \"nova-scheduler-0\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " 
pod="openstack/nova-scheduler-0" Jan 30 06:38:51 crc kubenswrapper[4931]: I0130 06:38:51.950939 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:38:52 crc kubenswrapper[4931]: I0130 06:38:52.464453 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:38:52 crc kubenswrapper[4931]: W0130 06:38:52.467627 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee55169a_5fa4_4ad5_b765_41685339650c.slice/crio-c43be8ffe96dc67c1c71579ad5ffebd7fcb89e53c15d4d3b57a62c61c04d179d WatchSource:0}: Error finding container c43be8ffe96dc67c1c71579ad5ffebd7fcb89e53c15d4d3b57a62c61c04d179d: Status 404 returned error can't find the container with id c43be8ffe96dc67c1c71579ad5ffebd7fcb89e53c15d4d3b57a62c61c04d179d Jan 30 06:38:52 crc kubenswrapper[4931]: I0130 06:38:52.538229 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee55169a-5fa4-4ad5-b765-41685339650c","Type":"ContainerStarted","Data":"c43be8ffe96dc67c1c71579ad5ffebd7fcb89e53c15d4d3b57a62c61c04d179d"} Jan 30 06:38:52 crc kubenswrapper[4931]: I0130 06:38:52.849968 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:38:52 crc kubenswrapper[4931]: I0130 06:38:52.850024 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:38:53 crc kubenswrapper[4931]: I0130 06:38:53.442841 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9399bfc6-7083-4978-b49d-bc46769c2b9e" path="/var/lib/kubelet/pods/9399bfc6-7083-4978-b49d-bc46769c2b9e/volumes" Jan 30 06:38:53 crc kubenswrapper[4931]: I0130 06:38:53.553252 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"ee55169a-5fa4-4ad5-b765-41685339650c","Type":"ContainerStarted","Data":"0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8"} Jan 30 06:38:53 crc kubenswrapper[4931]: I0130 06:38:53.581698 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.581672871 podStartE2EDuration="2.581672871s" podCreationTimestamp="2026-01-30 06:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:53.571117586 +0000 UTC m=+5468.941027873" watchObservedRunningTime="2026-01-30 06:38:53.581672871 +0000 UTC m=+5468.951583158" Jan 30 06:38:54 crc kubenswrapper[4931]: I0130 06:38:54.519872 4931 scope.go:117] "RemoveContainer" containerID="eb43994040c586b7da805891db0738dbfdbd3eca5875691d38856aeb8f3f02e2" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.350784 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.864212 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xqzzz"] Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.865993 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.870332 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.870796 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.876696 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqzzz"] Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.951648 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.971120 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.971179 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zws7k\" (UniqueName: \"kubernetes.io/projected/d80061de-8d87-4c58-8733-26c5224bf03a-kube-api-access-zws7k\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.971235 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-config-data\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 
06:38:56 crc kubenswrapper[4931]: I0130 06:38:56.971262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-scripts\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.073220 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-config-data\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.073608 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-scripts\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.073891 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.074111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zws7k\" (UniqueName: \"kubernetes.io/projected/d80061de-8d87-4c58-8733-26c5224bf03a-kube-api-access-zws7k\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 
06:38:57.078268 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.086180 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-scripts\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.094809 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zws7k\" (UniqueName: \"kubernetes.io/projected/d80061de-8d87-4c58-8733-26c5224bf03a-kube-api-access-zws7k\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.098833 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-config-data\") pod \"nova-cell1-cell-mapping-xqzzz\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.213610 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.363038 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.363291 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.363350 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.363933 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7acc7dd93c5206c6ff8ebf3271a041083dbd9bcf0e00cc88a42d6c0b4c7429dd"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.363982 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://7acc7dd93c5206c6ff8ebf3271a041083dbd9bcf0e00cc88a42d6c0b4c7429dd" gracePeriod=600 Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.603930 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="7acc7dd93c5206c6ff8ebf3271a041083dbd9bcf0e00cc88a42d6c0b4c7429dd" exitCode=0 Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.603985 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"7acc7dd93c5206c6ff8ebf3271a041083dbd9bcf0e00cc88a42d6c0b4c7429dd"} Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.604031 4931 scope.go:117] "RemoveContainer" containerID="a15ec04892f67aace9be1080c17614ea03774513f4a10f801ecc9b778518f57b" Jan 30 06:38:57 crc kubenswrapper[4931]: W0130 06:38:57.717675 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd80061de_8d87_4c58_8733_26c5224bf03a.slice/crio-4bbab414c5eeb76108044d3c6b386d270ce280324b5ec5da58d6e7fbd35f561d WatchSource:0}: Error finding container 4bbab414c5eeb76108044d3c6b386d270ce280324b5ec5da58d6e7fbd35f561d: Status 404 returned error can't find the container with id 4bbab414c5eeb76108044d3c6b386d270ce280324b5ec5da58d6e7fbd35f561d Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.728189 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqzzz"] Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.850275 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.850323 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.884790 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:38:57 crc kubenswrapper[4931]: I0130 06:38:57.886458 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:38:58 crc kubenswrapper[4931]: I0130 06:38:58.615087 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqzzz" event={"ID":"d80061de-8d87-4c58-8733-26c5224bf03a","Type":"ContainerStarted","Data":"ad91a28e445938a8582000a48ddcd232576020a5c15a4a29af6e45aaf8531507"} Jan 30 06:38:58 crc kubenswrapper[4931]: I0130 06:38:58.615545 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqzzz" event={"ID":"d80061de-8d87-4c58-8733-26c5224bf03a","Type":"ContainerStarted","Data":"4bbab414c5eeb76108044d3c6b386d270ce280324b5ec5da58d6e7fbd35f561d"} Jan 30 06:38:58 crc kubenswrapper[4931]: I0130 06:38:58.617845 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"} Jan 30 06:38:58 crc kubenswrapper[4931]: I0130 06:38:58.648550 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xqzzz" podStartSLOduration=2.648525491 podStartE2EDuration="2.648525491s" podCreationTimestamp="2026-01-30 06:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:58.640037354 +0000 UTC m=+5474.009947621" watchObservedRunningTime="2026-01-30 06:38:58.648525491 +0000 UTC m=+5474.018435758" Jan 30 06:38:59 crc kubenswrapper[4931]: I0130 06:38:59.014674 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:38:59 crc kubenswrapper[4931]: I0130 
06:38:59.014730 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:38:59 crc kubenswrapper[4931]: I0130 06:38:59.014714 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:38:59 crc kubenswrapper[4931]: I0130 06:38:59.014674 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.140245 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-47vng"] Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.143267 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.162719 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47vng"] Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.248467 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q779l\" (UniqueName: \"kubernetes.io/projected/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-kube-api-access-q779l\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.248510 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-catalog-content\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.248558 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-utilities\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.350077 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q779l\" (UniqueName: \"kubernetes.io/projected/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-kube-api-access-q779l\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.350121 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-catalog-content\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.350169 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-utilities\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.350638 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-utilities\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.350732 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-catalog-content\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.388840 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q779l\" (UniqueName: \"kubernetes.io/projected/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-kube-api-access-q779l\") pod \"community-operators-47vng\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.477077 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.951142 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 06:39:01 crc kubenswrapper[4931]: I0130 06:39:01.981336 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 06:39:02 crc kubenswrapper[4931]: I0130 06:39:02.028622 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47vng"] Jan 30 06:39:02 crc kubenswrapper[4931]: W0130 06:39:02.033603 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc3d76da_828d_4a79_8f4f_aa9003c7eb85.slice/crio-254d4413486427e0757d7276f1aefd28a7034ac0768f441833b69cb8156807be WatchSource:0}: Error finding container 254d4413486427e0757d7276f1aefd28a7034ac0768f441833b69cb8156807be: Status 404 returned error can't find the container with id 254d4413486427e0757d7276f1aefd28a7034ac0768f441833b69cb8156807be Jan 30 06:39:02 crc kubenswrapper[4931]: I0130 06:39:02.657155 4931 generic.go:334] "Generic (PLEG): container finished" podID="d80061de-8d87-4c58-8733-26c5224bf03a" containerID="ad91a28e445938a8582000a48ddcd232576020a5c15a4a29af6e45aaf8531507" exitCode=0 Jan 30 06:39:02 crc kubenswrapper[4931]: I0130 06:39:02.657234 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqzzz" event={"ID":"d80061de-8d87-4c58-8733-26c5224bf03a","Type":"ContainerDied","Data":"ad91a28e445938a8582000a48ddcd232576020a5c15a4a29af6e45aaf8531507"} Jan 30 06:39:02 crc kubenswrapper[4931]: I0130 06:39:02.661007 4931 generic.go:334] "Generic (PLEG): container finished" podID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerID="a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b" exitCode=0 Jan 30 06:39:02 crc kubenswrapper[4931]: 
I0130 06:39:02.661068 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47vng" event={"ID":"cc3d76da-828d-4a79-8f4f-aa9003c7eb85","Type":"ContainerDied","Data":"a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b"} Jan 30 06:39:02 crc kubenswrapper[4931]: I0130 06:39:02.661113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47vng" event={"ID":"cc3d76da-828d-4a79-8f4f-aa9003c7eb85","Type":"ContainerStarted","Data":"254d4413486427e0757d7276f1aefd28a7034ac0768f441833b69cb8156807be"} Jan 30 06:39:02 crc kubenswrapper[4931]: I0130 06:39:02.664623 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:39:02 crc kubenswrapper[4931]: I0130 06:39:02.709220 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 06:39:03 crc kubenswrapper[4931]: I0130 06:39:03.675281 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47vng" event={"ID":"cc3d76da-828d-4a79-8f4f-aa9003c7eb85","Type":"ContainerStarted","Data":"ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a"} Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.186923 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.321947 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zws7k\" (UniqueName: \"kubernetes.io/projected/d80061de-8d87-4c58-8733-26c5224bf03a-kube-api-access-zws7k\") pod \"d80061de-8d87-4c58-8733-26c5224bf03a\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.322171 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-combined-ca-bundle\") pod \"d80061de-8d87-4c58-8733-26c5224bf03a\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.322274 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-config-data\") pod \"d80061de-8d87-4c58-8733-26c5224bf03a\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.322299 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-scripts\") pod \"d80061de-8d87-4c58-8733-26c5224bf03a\" (UID: \"d80061de-8d87-4c58-8733-26c5224bf03a\") " Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.328531 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-scripts" (OuterVolumeSpecName: "scripts") pod "d80061de-8d87-4c58-8733-26c5224bf03a" (UID: "d80061de-8d87-4c58-8733-26c5224bf03a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.333533 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80061de-8d87-4c58-8733-26c5224bf03a-kube-api-access-zws7k" (OuterVolumeSpecName: "kube-api-access-zws7k") pod "d80061de-8d87-4c58-8733-26c5224bf03a" (UID: "d80061de-8d87-4c58-8733-26c5224bf03a"). InnerVolumeSpecName "kube-api-access-zws7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.346155 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-config-data" (OuterVolumeSpecName: "config-data") pod "d80061de-8d87-4c58-8733-26c5224bf03a" (UID: "d80061de-8d87-4c58-8733-26c5224bf03a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.353017 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d80061de-8d87-4c58-8733-26c5224bf03a" (UID: "d80061de-8d87-4c58-8733-26c5224bf03a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.425239 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.425522 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.425682 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zws7k\" (UniqueName: \"kubernetes.io/projected/d80061de-8d87-4c58-8733-26c5224bf03a-kube-api-access-zws7k\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.425814 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d80061de-8d87-4c58-8733-26c5224bf03a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.689753 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqzzz" event={"ID":"d80061de-8d87-4c58-8733-26c5224bf03a","Type":"ContainerDied","Data":"4bbab414c5eeb76108044d3c6b386d270ce280324b5ec5da58d6e7fbd35f561d"} Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.689815 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bbab414c5eeb76108044d3c6b386d270ce280324b5ec5da58d6e7fbd35f561d" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.689909 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqzzz" Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.699989 4931 generic.go:334] "Generic (PLEG): container finished" podID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerID="ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a" exitCode=0 Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.700091 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47vng" event={"ID":"cc3d76da-828d-4a79-8f4f-aa9003c7eb85","Type":"ContainerDied","Data":"ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a"} Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.872079 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.872783 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-api" containerID="cri-o://99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed" gracePeriod=30 Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.873361 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-log" containerID="cri-o://587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8" gracePeriod=30 Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.943557 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.943798 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ee55169a-5fa4-4ad5-b765-41685339650c" containerName="nova-scheduler-scheduler" containerID="cri-o://0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" gracePeriod=30 Jan 30 
06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.959867 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.960121 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-log" containerID="cri-o://93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb" gracePeriod=30 Jan 30 06:39:04 crc kubenswrapper[4931]: I0130 06:39:04.960271 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-metadata" containerID="cri-o://c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5" gracePeriod=30 Jan 30 06:39:05 crc kubenswrapper[4931]: E0130 06:39:05.029033 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7144d1e4_cf7f_4cd9_891c_02bf466f894f.slice/crio-587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8.scope\": RecentStats: unable to find data in memory cache]" Jan 30 06:39:05 crc kubenswrapper[4931]: I0130 06:39:05.713713 4931 generic.go:334] "Generic (PLEG): container finished" podID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerID="587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8" exitCode=143 Jan 30 06:39:05 crc kubenswrapper[4931]: I0130 06:39:05.713786 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7144d1e4-cf7f-4cd9-891c-02bf466f894f","Type":"ContainerDied","Data":"587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8"} Jan 30 06:39:05 crc kubenswrapper[4931]: I0130 06:39:05.716360 4931 generic.go:334] "Generic (PLEG): container finished" podID="dd921adf-3fc0-4727-bf34-17203123e432" 
containerID="93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb" exitCode=143 Jan 30 06:39:05 crc kubenswrapper[4931]: I0130 06:39:05.716450 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd921adf-3fc0-4727-bf34-17203123e432","Type":"ContainerDied","Data":"93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb"} Jan 30 06:39:05 crc kubenswrapper[4931]: I0130 06:39:05.718688 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47vng" event={"ID":"cc3d76da-828d-4a79-8f4f-aa9003c7eb85","Type":"ContainerStarted","Data":"c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1"} Jan 30 06:39:05 crc kubenswrapper[4931]: I0130 06:39:05.748378 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47vng" podStartSLOduration=2.322250964 podStartE2EDuration="4.748356576s" podCreationTimestamp="2026-01-30 06:39:01 +0000 UTC" firstStartedPulling="2026-01-30 06:39:02.663406312 +0000 UTC m=+5478.033316569" lastFinishedPulling="2026-01-30 06:39:05.089511924 +0000 UTC m=+5480.459422181" observedRunningTime="2026-01-30 06:39:05.737005529 +0000 UTC m=+5481.106915796" watchObservedRunningTime="2026-01-30 06:39:05.748356576 +0000 UTC m=+5481.118266853" Jan 30 06:39:06 crc kubenswrapper[4931]: E0130 06:39:06.956854 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 06:39:06 crc kubenswrapper[4931]: E0130 06:39:06.958857 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 06:39:06 crc kubenswrapper[4931]: E0130 06:39:06.960780 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 06:39:06 crc kubenswrapper[4931]: E0130 06:39:06.960846 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ee55169a-5fa4-4ad5-b765-41685339650c" containerName="nova-scheduler-scheduler" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.554894 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.563266 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.606890 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7144d1e4-cf7f-4cd9-891c-02bf466f894f-logs\") pod \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.606934 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-config-data\") pod \"dd921adf-3fc0-4727-bf34-17203123e432\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.606974 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd921adf-3fc0-4727-bf34-17203123e432-logs\") pod \"dd921adf-3fc0-4727-bf34-17203123e432\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.607000 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpc9m\" (UniqueName: \"kubernetes.io/projected/7144d1e4-cf7f-4cd9-891c-02bf466f894f-kube-api-access-mpc9m\") pod \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.607050 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-combined-ca-bundle\") pod \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.607089 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-config-data\") pod \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\" (UID: \"7144d1e4-cf7f-4cd9-891c-02bf466f894f\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.607151 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-combined-ca-bundle\") pod \"dd921adf-3fc0-4727-bf34-17203123e432\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.607226 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9psj\" (UniqueName: \"kubernetes.io/projected/dd921adf-3fc0-4727-bf34-17203123e432-kube-api-access-z9psj\") pod \"dd921adf-3fc0-4727-bf34-17203123e432\" (UID: \"dd921adf-3fc0-4727-bf34-17203123e432\") " Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.607894 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7144d1e4-cf7f-4cd9-891c-02bf466f894f-logs" (OuterVolumeSpecName: "logs") pod "7144d1e4-cf7f-4cd9-891c-02bf466f894f" (UID: "7144d1e4-cf7f-4cd9-891c-02bf466f894f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.608001 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd921adf-3fc0-4727-bf34-17203123e432-logs" (OuterVolumeSpecName: "logs") pod "dd921adf-3fc0-4727-bf34-17203123e432" (UID: "dd921adf-3fc0-4727-bf34-17203123e432"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.608453 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7144d1e4-cf7f-4cd9-891c-02bf466f894f-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.608472 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd921adf-3fc0-4727-bf34-17203123e432-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.612771 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd921adf-3fc0-4727-bf34-17203123e432-kube-api-access-z9psj" (OuterVolumeSpecName: "kube-api-access-z9psj") pod "dd921adf-3fc0-4727-bf34-17203123e432" (UID: "dd921adf-3fc0-4727-bf34-17203123e432"). InnerVolumeSpecName "kube-api-access-z9psj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.613089 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7144d1e4-cf7f-4cd9-891c-02bf466f894f-kube-api-access-mpc9m" (OuterVolumeSpecName: "kube-api-access-mpc9m") pod "7144d1e4-cf7f-4cd9-891c-02bf466f894f" (UID: "7144d1e4-cf7f-4cd9-891c-02bf466f894f"). InnerVolumeSpecName "kube-api-access-mpc9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.631155 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-config-data" (OuterVolumeSpecName: "config-data") pod "7144d1e4-cf7f-4cd9-891c-02bf466f894f" (UID: "7144d1e4-cf7f-4cd9-891c-02bf466f894f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.632022 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-config-data" (OuterVolumeSpecName: "config-data") pod "dd921adf-3fc0-4727-bf34-17203123e432" (UID: "dd921adf-3fc0-4727-bf34-17203123e432"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.634593 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7144d1e4-cf7f-4cd9-891c-02bf466f894f" (UID: "7144d1e4-cf7f-4cd9-891c-02bf466f894f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.641200 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd921adf-3fc0-4727-bf34-17203123e432" (UID: "dd921adf-3fc0-4727-bf34-17203123e432"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.710033 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9psj\" (UniqueName: \"kubernetes.io/projected/dd921adf-3fc0-4727-bf34-17203123e432-kube-api-access-z9psj\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.710305 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.710372 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpc9m\" (UniqueName: \"kubernetes.io/projected/7144d1e4-cf7f-4cd9-891c-02bf466f894f-kube-api-access-mpc9m\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.710450 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.710513 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7144d1e4-cf7f-4cd9-891c-02bf466f894f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.710598 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd921adf-3fc0-4727-bf34-17203123e432-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.756168 4931 generic.go:334] "Generic (PLEG): container finished" podID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerID="99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed" exitCode=0 Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.756213 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7144d1e4-cf7f-4cd9-891c-02bf466f894f","Type":"ContainerDied","Data":"99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed"} Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.756257 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.756566 4931 scope.go:117] "RemoveContainer" containerID="99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.756552 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7144d1e4-cf7f-4cd9-891c-02bf466f894f","Type":"ContainerDied","Data":"d8da632fe1a21ba627e5a93e9f2262e371b5481c6d1d50e41210e4e09d1d78c5"} Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.760270 4931 generic.go:334] "Generic (PLEG): container finished" podID="dd921adf-3fc0-4727-bf34-17203123e432" containerID="c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5" exitCode=0 Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.760315 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.760347 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd921adf-3fc0-4727-bf34-17203123e432","Type":"ContainerDied","Data":"c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5"} Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.760733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd921adf-3fc0-4727-bf34-17203123e432","Type":"ContainerDied","Data":"3427a754aa9dbc9c068a50beefaa44100c472db7b66af28a59aa83acea56ca82"} Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.815563 4931 scope.go:117] "RemoveContainer" containerID="587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.845975 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.849413 4931 scope.go:117] "RemoveContainer" containerID="99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed" Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.850006 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed\": container with ID starting with 99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed not found: ID does not exist" containerID="99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.850066 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed"} err="failed to get container status \"99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed\": rpc error: code = NotFound desc = 
could not find container \"99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed\": container with ID starting with 99aec164fb9690cd6c1b7a53b3b877acbec5359ed456860e231836eab572d4ed not found: ID does not exist" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.850100 4931 scope.go:117] "RemoveContainer" containerID="587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8" Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.850597 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8\": container with ID starting with 587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8 not found: ID does not exist" containerID="587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.850629 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8"} err="failed to get container status \"587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8\": rpc error: code = NotFound desc = could not find container \"587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8\": container with ID starting with 587caf75c31add55f1a6a574c49914f02af25c4b0be8f6b086fffe845f649aa8 not found: ID does not exist" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.850675 4931 scope.go:117] "RemoveContainer" containerID="c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.877825 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.894592 4931 scope.go:117] "RemoveContainer" containerID="93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb" Jan 30 06:39:08 crc 
kubenswrapper[4931]: I0130 06:39:08.898519 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.899079 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-log" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899108 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-log" Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.899125 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80061de-8d87-4c58-8733-26c5224bf03a" containerName="nova-manage" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899137 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80061de-8d87-4c58-8733-26c5224bf03a" containerName="nova-manage" Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.899174 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-log" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899186 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-log" Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.899204 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-api" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899216 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-api" Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.899231 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-metadata" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899242 4931 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-metadata" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899578 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d80061de-8d87-4c58-8733-26c5224bf03a" containerName="nova-manage" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899621 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-log" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899646 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-api" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899665 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd921adf-3fc0-4727-bf34-17203123e432" containerName="nova-metadata-metadata" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.899701 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" containerName="nova-api-log" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.901369 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.903880 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.909876 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.918108 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.930234 4931 scope.go:117] "RemoveContainer" containerID="c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.931819 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.932595 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5\": container with ID starting with c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5 not found: ID does not exist" containerID="c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.932634 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5"} err="failed to get container status \"c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5\": rpc error: code = NotFound desc = could not find container \"c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5\": container with ID starting with c0c11230e25196ea7528394085a766313743d5088ac7e8e0d9898abd0fdf47d5 not found: ID does not exist" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.932666 4931 scope.go:117] 
"RemoveContainer" containerID="93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb" Jan 30 06:39:08 crc kubenswrapper[4931]: E0130 06:39:08.932950 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb\": container with ID starting with 93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb not found: ID does not exist" containerID="93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.932966 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb"} err="failed to get container status \"93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb\": rpc error: code = NotFound desc = could not find container \"93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb\": container with ID starting with 93f20145e875dd7d7fbc11140ea0da414569321168d69c332aa4d305f94bbfcb not found: ID does not exist" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.940209 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.942403 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.948046 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 06:39:08 crc kubenswrapper[4931]: I0130 06:39:08.952140 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.020567 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-config-data\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.020644 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgzf\" (UniqueName: \"kubernetes.io/projected/4c56d95d-5087-41db-a759-2273aef32a3c-kube-api-access-wcgzf\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.020938 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.021096 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.021255 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530f454-5ee2-4767-8c0b-75d50ba8a44b-logs\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.021454 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c56d95d-5087-41db-a759-2273aef32a3c-logs\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.021497 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chqpp\" (UniqueName: \"kubernetes.io/projected/2530f454-5ee2-4767-8c0b-75d50ba8a44b-kube-api-access-chqpp\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.021624 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-config-data\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.076489 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.123868 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c56d95d-5087-41db-a759-2273aef32a3c-logs\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chqpp\" (UniqueName: \"kubernetes.io/projected/2530f454-5ee2-4767-8c0b-75d50ba8a44b-kube-api-access-chqpp\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124221 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-config-data\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124277 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-config-data\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124311 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgzf\" (UniqueName: \"kubernetes.io/projected/4c56d95d-5087-41db-a759-2273aef32a3c-kube-api-access-wcgzf\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124335 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4c56d95d-5087-41db-a759-2273aef32a3c-logs\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124363 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124397 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124510 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530f454-5ee2-4767-8c0b-75d50ba8a44b-logs\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.124888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530f454-5ee2-4767-8c0b-75d50ba8a44b-logs\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.132340 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.133709 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-config-data\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.134747 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.139995 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-config-data\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.144320 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chqpp\" (UniqueName: \"kubernetes.io/projected/2530f454-5ee2-4767-8c0b-75d50ba8a44b-kube-api-access-chqpp\") pod \"nova-api-0\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.145374 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgzf\" (UniqueName: \"kubernetes.io/projected/4c56d95d-5087-41db-a759-2273aef32a3c-kube-api-access-wcgzf\") pod \"nova-metadata-0\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.216614 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.225587 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-config-data\") pod \"ee55169a-5fa4-4ad5-b765-41685339650c\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.225732 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6khfr\" (UniqueName: \"kubernetes.io/projected/ee55169a-5fa4-4ad5-b765-41685339650c-kube-api-access-6khfr\") pod \"ee55169a-5fa4-4ad5-b765-41685339650c\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.225788 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-combined-ca-bundle\") pod \"ee55169a-5fa4-4ad5-b765-41685339650c\" (UID: \"ee55169a-5fa4-4ad5-b765-41685339650c\") " Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.232283 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee55169a-5fa4-4ad5-b765-41685339650c-kube-api-access-6khfr" (OuterVolumeSpecName: "kube-api-access-6khfr") pod "ee55169a-5fa4-4ad5-b765-41685339650c" (UID: "ee55169a-5fa4-4ad5-b765-41685339650c"). InnerVolumeSpecName "kube-api-access-6khfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.255366 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee55169a-5fa4-4ad5-b765-41685339650c" (UID: "ee55169a-5fa4-4ad5-b765-41685339650c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.259468 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.272635 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-config-data" (OuterVolumeSpecName: "config-data") pod "ee55169a-5fa4-4ad5-b765-41685339650c" (UID: "ee55169a-5fa4-4ad5-b765-41685339650c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.328090 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.328128 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6khfr\" (UniqueName: \"kubernetes.io/projected/ee55169a-5fa4-4ad5-b765-41685339650c-kube-api-access-6khfr\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.328143 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee55169a-5fa4-4ad5-b765-41685339650c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.443452 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7144d1e4-cf7f-4cd9-891c-02bf466f894f" path="/var/lib/kubelet/pods/7144d1e4-cf7f-4cd9-891c-02bf466f894f/volumes" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.444231 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd921adf-3fc0-4727-bf34-17203123e432" path="/var/lib/kubelet/pods/dd921adf-3fc0-4727-bf34-17203123e432/volumes" Jan 30 06:39:09 crc 
kubenswrapper[4931]: W0130 06:39:09.724523 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c56d95d_5087_41db_a759_2273aef32a3c.slice/crio-927c72bfe637ff62dfcafae17c76ddbeded161c27b84f0c5f70d67241dbe6fdd WatchSource:0}: Error finding container 927c72bfe637ff62dfcafae17c76ddbeded161c27b84f0c5f70d67241dbe6fdd: Status 404 returned error can't find the container with id 927c72bfe637ff62dfcafae17c76ddbeded161c27b84f0c5f70d67241dbe6fdd Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.727850 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.790854 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c56d95d-5087-41db-a759-2273aef32a3c","Type":"ContainerStarted","Data":"927c72bfe637ff62dfcafae17c76ddbeded161c27b84f0c5f70d67241dbe6fdd"} Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.792946 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.799788 4931 generic.go:334] "Generic (PLEG): container finished" podID="ee55169a-5fa4-4ad5-b765-41685339650c" containerID="0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" exitCode=0 Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.799816 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee55169a-5fa4-4ad5-b765-41685339650c","Type":"ContainerDied","Data":"0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8"} Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.799839 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee55169a-5fa4-4ad5-b765-41685339650c","Type":"ContainerDied","Data":"c43be8ffe96dc67c1c71579ad5ffebd7fcb89e53c15d4d3b57a62c61c04d179d"} Jan 30 06:39:09 crc 
kubenswrapper[4931]: I0130 06:39:09.799855 4931 scope.go:117] "RemoveContainer" containerID="0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.799895 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:39:09 crc kubenswrapper[4931]: W0130 06:39:09.814913 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2530f454_5ee2_4767_8c0b_75d50ba8a44b.slice/crio-aef8e55688d59f4d24924520cc3d195a0fe2137c7ba9584c52f87f8f22ad3859 WatchSource:0}: Error finding container aef8e55688d59f4d24924520cc3d195a0fe2137c7ba9584c52f87f8f22ad3859: Status 404 returned error can't find the container with id aef8e55688d59f4d24924520cc3d195a0fe2137c7ba9584c52f87f8f22ad3859 Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.831177 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.832635 4931 scope.go:117] "RemoveContainer" containerID="0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" Jan 30 06:39:09 crc kubenswrapper[4931]: E0130 06:39:09.833123 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8\": container with ID starting with 0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8 not found: ID does not exist" containerID="0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.833173 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8"} err="failed to get container status 
\"0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8\": rpc error: code = NotFound desc = could not find container \"0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8\": container with ID starting with 0c76769cb41b7bc6ed6258448d4d6f919efef2f7b0e763d4a7ea1422d4adbfd8 not found: ID does not exist" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.853750 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.861692 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:39:09 crc kubenswrapper[4931]: E0130 06:39:09.862343 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee55169a-5fa4-4ad5-b765-41685339650c" containerName="nova-scheduler-scheduler" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.862363 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee55169a-5fa4-4ad5-b765-41685339650c" containerName="nova-scheduler-scheduler" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.862734 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee55169a-5fa4-4ad5-b765-41685339650c" containerName="nova-scheduler-scheduler" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.863799 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.868261 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.889575 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.950800 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpxfx\" (UniqueName: \"kubernetes.io/projected/0cc67f7b-ac8c-4b63-8f28-fd5135307022-kube-api-access-gpxfx\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.950847 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-config-data\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:09 crc kubenswrapper[4931]: I0130 06:39:09.950869 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.051714 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.052093 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpxfx\" (UniqueName: \"kubernetes.io/projected/0cc67f7b-ac8c-4b63-8f28-fd5135307022-kube-api-access-gpxfx\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.052119 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-config-data\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.057036 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.057094 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-config-data\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.077664 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpxfx\" (UniqueName: \"kubernetes.io/projected/0cc67f7b-ac8c-4b63-8f28-fd5135307022-kube-api-access-gpxfx\") pod \"nova-scheduler-0\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") " pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.220741 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.707744 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.816956 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c56d95d-5087-41db-a759-2273aef32a3c","Type":"ContainerStarted","Data":"7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a"} Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.817312 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c56d95d-5087-41db-a759-2273aef32a3c","Type":"ContainerStarted","Data":"a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd"} Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.818724 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc67f7b-ac8c-4b63-8f28-fd5135307022","Type":"ContainerStarted","Data":"caa956bb861b331dfc23294a937380376a24f5f4a7dcf1c49c1dbdd00bea437a"} Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.828686 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2530f454-5ee2-4767-8c0b-75d50ba8a44b","Type":"ContainerStarted","Data":"b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155"} Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.828746 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2530f454-5ee2-4767-8c0b-75d50ba8a44b","Type":"ContainerStarted","Data":"d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc"} Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.828768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2530f454-5ee2-4767-8c0b-75d50ba8a44b","Type":"ContainerStarted","Data":"aef8e55688d59f4d24924520cc3d195a0fe2137c7ba9584c52f87f8f22ad3859"} Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.863577 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.863546998 podStartE2EDuration="2.863546998s" podCreationTimestamp="2026-01-30 06:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:10.844989889 +0000 UTC m=+5486.214900156" watchObservedRunningTime="2026-01-30 06:39:10.863546998 +0000 UTC m=+5486.233457295" Jan 30 06:39:10 crc kubenswrapper[4931]: I0130 06:39:10.874708 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.87468396 podStartE2EDuration="2.87468396s" podCreationTimestamp="2026-01-30 06:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:10.864167456 +0000 UTC m=+5486.234077713" watchObservedRunningTime="2026-01-30 06:39:10.87468396 +0000 UTC m=+5486.244594257" Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.440767 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee55169a-5fa4-4ad5-b765-41685339650c" path="/var/lib/kubelet/pods/ee55169a-5fa4-4ad5-b765-41685339650c/volumes" Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.477662 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.477794 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.559091 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.842152 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc67f7b-ac8c-4b63-8f28-fd5135307022","Type":"ContainerStarted","Data":"51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635"} Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.877496 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.877471444 podStartE2EDuration="2.877471444s" podCreationTimestamp="2026-01-30 06:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:11.874560193 +0000 UTC m=+5487.244470500" watchObservedRunningTime="2026-01-30 06:39:11.877471444 +0000 UTC m=+5487.247381741" Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.928856 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:11 crc kubenswrapper[4931]: I0130 06:39:11.991281 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47vng"] Jan 30 06:39:13 crc kubenswrapper[4931]: I0130 06:39:13.864218 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47vng" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="registry-server" containerID="cri-o://c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1" gracePeriod=2 Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.220587 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.220994 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.367083 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.447105 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-catalog-content\") pod \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.447301 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-utilities\") pod \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.447390 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q779l\" (UniqueName: \"kubernetes.io/projected/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-kube-api-access-q779l\") pod \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\" (UID: \"cc3d76da-828d-4a79-8f4f-aa9003c7eb85\") " Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.448858 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-utilities" (OuterVolumeSpecName: "utilities") pod "cc3d76da-828d-4a79-8f4f-aa9003c7eb85" (UID: "cc3d76da-828d-4a79-8f4f-aa9003c7eb85"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.453399 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-kube-api-access-q779l" (OuterVolumeSpecName: "kube-api-access-q779l") pod "cc3d76da-828d-4a79-8f4f-aa9003c7eb85" (UID: "cc3d76da-828d-4a79-8f4f-aa9003c7eb85"). InnerVolumeSpecName "kube-api-access-q779l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.502447 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc3d76da-828d-4a79-8f4f-aa9003c7eb85" (UID: "cc3d76da-828d-4a79-8f4f-aa9003c7eb85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.549846 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q779l\" (UniqueName: \"kubernetes.io/projected/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-kube-api-access-q779l\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.549875 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.549884 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc3d76da-828d-4a79-8f4f-aa9003c7eb85-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.878027 4931 generic.go:334] "Generic (PLEG): container finished" podID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" 
containerID="c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1" exitCode=0 Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.878102 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47vng" event={"ID":"cc3d76da-828d-4a79-8f4f-aa9003c7eb85","Type":"ContainerDied","Data":"c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1"} Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.878115 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47vng" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.878144 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47vng" event={"ID":"cc3d76da-828d-4a79-8f4f-aa9003c7eb85","Type":"ContainerDied","Data":"254d4413486427e0757d7276f1aefd28a7034ac0768f441833b69cb8156807be"} Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.878175 4931 scope.go:117] "RemoveContainer" containerID="c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.908153 4931 scope.go:117] "RemoveContainer" containerID="ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.944442 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47vng"] Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.960470 4931 scope.go:117] "RemoveContainer" containerID="a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.963585 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47vng"] Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.985370 4931 scope.go:117] "RemoveContainer" containerID="c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1" Jan 30 
06:39:14 crc kubenswrapper[4931]: E0130 06:39:14.986219 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1\": container with ID starting with c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1 not found: ID does not exist" containerID="c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.986257 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1"} err="failed to get container status \"c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1\": rpc error: code = NotFound desc = could not find container \"c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1\": container with ID starting with c48f0755a45d4afa892452045777a85195356f2296938c94d478ca9d785262a1 not found: ID does not exist" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.986278 4931 scope.go:117] "RemoveContainer" containerID="ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a" Jan 30 06:39:14 crc kubenswrapper[4931]: E0130 06:39:14.986702 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a\": container with ID starting with ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a not found: ID does not exist" containerID="ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.986732 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a"} err="failed to get container status 
\"ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a\": rpc error: code = NotFound desc = could not find container \"ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a\": container with ID starting with ef4ac3d8b0d4e86ef467dd6f37fc4673ea797f5d3dbabc0f98fc2c28865e956a not found: ID does not exist" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.986749 4931 scope.go:117] "RemoveContainer" containerID="a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b" Jan 30 06:39:14 crc kubenswrapper[4931]: E0130 06:39:14.986984 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b\": container with ID starting with a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b not found: ID does not exist" containerID="a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b" Jan 30 06:39:14 crc kubenswrapper[4931]: I0130 06:39:14.987018 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b"} err="failed to get container status \"a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b\": rpc error: code = NotFound desc = could not find container \"a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b\": container with ID starting with a2e2cc9e9d4103ac2ed8bada89c67b32e5943424cecc01bc674e582cc8f1414b not found: ID does not exist" Jan 30 06:39:15 crc kubenswrapper[4931]: I0130 06:39:15.220838 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 06:39:15 crc kubenswrapper[4931]: I0130 06:39:15.446236 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" path="/var/lib/kubelet/pods/cc3d76da-828d-4a79-8f4f-aa9003c7eb85/volumes" Jan 30 06:39:19 crc 
kubenswrapper[4931]: I0130 06:39:19.217345 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:39:19 crc kubenswrapper[4931]: I0130 06:39:19.218084 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:39:19 crc kubenswrapper[4931]: I0130 06:39:19.260651 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:39:19 crc kubenswrapper[4931]: I0130 06:39:19.261584 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:39:20 crc kubenswrapper[4931]: I0130 06:39:20.221856 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 06:39:20 crc kubenswrapper[4931]: I0130 06:39:20.259259 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 06:39:20 crc kubenswrapper[4931]: I0130 06:39:20.299650 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:39:20 crc kubenswrapper[4931]: I0130 06:39:20.299695 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:39:20 crc kubenswrapper[4931]: I0130 06:39:20.381856 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-api" probeResult="failure" output="Get 
\"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:39:20 crc kubenswrapper[4931]: I0130 06:39:20.382075 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:39:20 crc kubenswrapper[4931]: I0130 06:39:20.979591 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 06:39:29 crc kubenswrapper[4931]: I0130 06:39:29.222127 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 06:39:29 crc kubenswrapper[4931]: I0130 06:39:29.224176 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 06:39:29 crc kubenswrapper[4931]: I0130 06:39:29.226239 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 06:39:29 crc kubenswrapper[4931]: I0130 06:39:29.276864 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 06:39:29 crc kubenswrapper[4931]: I0130 06:39:29.277713 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 06:39:29 crc kubenswrapper[4931]: I0130 06:39:29.277976 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 06:39:29 crc kubenswrapper[4931]: I0130 06:39:29.283338 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.065455 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 
06:39:30.067793 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.069695 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.317289 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"] Jan 30 06:39:30 crc kubenswrapper[4931]: E0130 06:39:30.317881 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="registry-server" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.317894 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="registry-server" Jan 30 06:39:30 crc kubenswrapper[4931]: E0130 06:39:30.317908 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="extract-content" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.317914 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="extract-content" Jan 30 06:39:30 crc kubenswrapper[4931]: E0130 06:39:30.317936 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="extract-utilities" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.317942 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="extract-utilities" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.318126 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc3d76da-828d-4a79-8f4f-aa9003c7eb85" containerName="registry-server" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.318980 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.344070 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"] Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.390618 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.390816 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drhrf\" (UniqueName: \"kubernetes.io/projected/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-kube-api-access-drhrf\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.390941 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-config\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.391646 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.391748 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-dns-svc\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.495599 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.495676 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-dns-svc\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.495889 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.496138 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drhrf\" (UniqueName: \"kubernetes.io/projected/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-kube-api-access-drhrf\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.496889 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.497069 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.497076 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-dns-svc\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.497290 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-config\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.497842 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-config\") pod \"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.515389 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drhrf\" (UniqueName: \"kubernetes.io/projected/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-kube-api-access-drhrf\") pod 
\"dnsmasq-dns-5b6b7dcd95-j4hmz\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") " pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:30 crc kubenswrapper[4931]: I0130 06:39:30.636853 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:31 crc kubenswrapper[4931]: I0130 06:39:31.690186 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"] Jan 30 06:39:31 crc kubenswrapper[4931]: W0130 06:39:31.707682 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf1a68d3_4ec7_46e5_9fee_ea7f89ec8c7b.slice/crio-4b8f45f3ed9effeb333573d3d797f78260b16f7ed70972bc0ea03075747c1260 WatchSource:0}: Error finding container 4b8f45f3ed9effeb333573d3d797f78260b16f7ed70972bc0ea03075747c1260: Status 404 returned error can't find the container with id 4b8f45f3ed9effeb333573d3d797f78260b16f7ed70972bc0ea03075747c1260 Jan 30 06:39:32 crc kubenswrapper[4931]: I0130 06:39:32.089528 4931 generic.go:334] "Generic (PLEG): container finished" podID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerID="f570ad04682e1e4020b7b6f4103c4537ec3f58b31b00dec90861ecb783916f09" exitCode=0 Jan 30 06:39:32 crc kubenswrapper[4931]: I0130 06:39:32.089582 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" event={"ID":"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b","Type":"ContainerDied","Data":"f570ad04682e1e4020b7b6f4103c4537ec3f58b31b00dec90861ecb783916f09"} Jan 30 06:39:32 crc kubenswrapper[4931]: I0130 06:39:32.090101 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" event={"ID":"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b","Type":"ContainerStarted","Data":"4b8f45f3ed9effeb333573d3d797f78260b16f7ed70972bc0ea03075747c1260"} Jan 30 06:39:33 crc kubenswrapper[4931]: I0130 06:39:33.106157 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" event={"ID":"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b","Type":"ContainerStarted","Data":"88c455ebf77b0c0d27d3a79f9baa7eb72c57c08194cb775791bc835634485c4f"} Jan 30 06:39:33 crc kubenswrapper[4931]: I0130 06:39:33.106577 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:33 crc kubenswrapper[4931]: I0130 06:39:33.127106 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" podStartSLOduration=3.127075451 podStartE2EDuration="3.127075451s" podCreationTimestamp="2026-01-30 06:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:33.123875852 +0000 UTC m=+5508.493786139" watchObservedRunningTime="2026-01-30 06:39:33.127075451 +0000 UTC m=+5508.496985738" Jan 30 06:39:40 crc kubenswrapper[4931]: I0130 06:39:40.638673 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" Jan 30 06:39:40 crc kubenswrapper[4931]: I0130 06:39:40.713795 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67c9d4fb9c-cpnds"] Jan 30 06:39:40 crc kubenswrapper[4931]: I0130 06:39:40.714002 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" podUID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerName="dnsmasq-dns" containerID="cri-o://d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69" gracePeriod=10 Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.195357 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.201512 4931 generic.go:334] "Generic (PLEG): container finished" podID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerID="d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69" exitCode=0
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.201547 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" event={"ID":"c8aaa63b-49f3-44c6-abe3-d24692e5894e","Type":"ContainerDied","Data":"d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69"}
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.201569 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds" event={"ID":"c8aaa63b-49f3-44c6-abe3-d24692e5894e","Type":"ContainerDied","Data":"be21e08fe5735cb0ef573095e6460329d55a3b26fd373de9ad820520ace903ab"}
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.201585 4931 scope.go:117] "RemoveContainer" containerID="d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.201628 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67c9d4fb9c-cpnds"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.243082 4931 scope.go:117] "RemoveContainer" containerID="297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.283043 4931 scope.go:117] "RemoveContainer" containerID="d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69"
Jan 30 06:39:41 crc kubenswrapper[4931]: E0130 06:39:41.283543 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69\": container with ID starting with d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69 not found: ID does not exist" containerID="d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.283574 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69"} err="failed to get container status \"d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69\": rpc error: code = NotFound desc = could not find container \"d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69\": container with ID starting with d1d0fb7302cb4bb01785b643544b6fa52ef364befea892a07300a916312dad69 not found: ID does not exist"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.283602 4931 scope.go:117] "RemoveContainer" containerID="297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181"
Jan 30 06:39:41 crc kubenswrapper[4931]: E0130 06:39:41.283975 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181\": container with ID starting with 297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181 not found: ID does not exist" containerID="297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.284000 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181"} err="failed to get container status \"297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181\": rpc error: code = NotFound desc = could not find container \"297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181\": container with ID starting with 297ce66cdef573b34d7a957122e5080cc6a632a68b54cc34d36fdf95b8d11181 not found: ID does not exist"
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.326352 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-config\") pod \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") "
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.326449 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-dns-svc\") pod \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") "
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.327270 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-sb\") pod \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") "
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.327295 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-nb\") pod \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") "
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.327328 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm6xc\" (UniqueName: \"kubernetes.io/projected/c8aaa63b-49f3-44c6-abe3-d24692e5894e-kube-api-access-hm6xc\") pod \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\" (UID: \"c8aaa63b-49f3-44c6-abe3-d24692e5894e\") "
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.335715 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8aaa63b-49f3-44c6-abe3-d24692e5894e-kube-api-access-hm6xc" (OuterVolumeSpecName: "kube-api-access-hm6xc") pod "c8aaa63b-49f3-44c6-abe3-d24692e5894e" (UID: "c8aaa63b-49f3-44c6-abe3-d24692e5894e"). InnerVolumeSpecName "kube-api-access-hm6xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.381356 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8aaa63b-49f3-44c6-abe3-d24692e5894e" (UID: "c8aaa63b-49f3-44c6-abe3-d24692e5894e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.381404 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-config" (OuterVolumeSpecName: "config") pod "c8aaa63b-49f3-44c6-abe3-d24692e5894e" (UID: "c8aaa63b-49f3-44c6-abe3-d24692e5894e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.391549 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8aaa63b-49f3-44c6-abe3-d24692e5894e" (UID: "c8aaa63b-49f3-44c6-abe3-d24692e5894e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.407093 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8aaa63b-49f3-44c6-abe3-d24692e5894e" (UID: "c8aaa63b-49f3-44c6-abe3-d24692e5894e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.428749 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.428790 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.428799 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.428810 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm6xc\" (UniqueName: \"kubernetes.io/projected/c8aaa63b-49f3-44c6-abe3-d24692e5894e-kube-api-access-hm6xc\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.428820 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8aaa63b-49f3-44c6-abe3-d24692e5894e-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.537237 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67c9d4fb9c-cpnds"]
Jan 30 06:39:41 crc kubenswrapper[4931]: I0130 06:39:41.549016 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67c9d4fb9c-cpnds"]
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.432851 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" path="/var/lib/kubelet/pods/c8aaa63b-49f3-44c6-abe3-d24692e5894e/volumes"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.441107 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7fmgw"]
Jan 30 06:39:43 crc kubenswrapper[4931]: E0130 06:39:43.441513 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerName="init"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.441535 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerName="init"
Jan 30 06:39:43 crc kubenswrapper[4931]: E0130 06:39:43.441570 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerName="dnsmasq-dns"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.441577 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerName="dnsmasq-dns"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.441728 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8aaa63b-49f3-44c6-abe3-d24692e5894e" containerName="dnsmasq-dns"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.442292 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.458629 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7fmgw"]
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.478286 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7909a37-a194-4666-b642-8193c2b8e29c-operator-scripts\") pod \"cinder-db-create-7fmgw\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") " pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.478366 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gxb\" (UniqueName: \"kubernetes.io/projected/a7909a37-a194-4666-b642-8193c2b8e29c-kube-api-access-f7gxb\") pod \"cinder-db-create-7fmgw\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") " pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.550902 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f264-account-create-update-hm2jw"]
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.552245 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.558548 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f264-account-create-update-hm2jw"]
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.558764 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.580664 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7909a37-a194-4666-b642-8193c2b8e29c-operator-scripts\") pod \"cinder-db-create-7fmgw\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") " pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.580733 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gxb\" (UniqueName: \"kubernetes.io/projected/a7909a37-a194-4666-b642-8193c2b8e29c-kube-api-access-f7gxb\") pod \"cinder-db-create-7fmgw\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") " pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.581446 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7909a37-a194-4666-b642-8193c2b8e29c-operator-scripts\") pod \"cinder-db-create-7fmgw\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") " pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.611503 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gxb\" (UniqueName: \"kubernetes.io/projected/a7909a37-a194-4666-b642-8193c2b8e29c-kube-api-access-f7gxb\") pod \"cinder-db-create-7fmgw\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") " pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.682194 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44bda186-cc7a-4422-8266-5f494795cf7f-operator-scripts\") pod \"cinder-f264-account-create-update-hm2jw\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") " pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.682275 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68rpb\" (UniqueName: \"kubernetes.io/projected/44bda186-cc7a-4422-8266-5f494795cf7f-kube-api-access-68rpb\") pod \"cinder-f264-account-create-update-hm2jw\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") " pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.759275 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.784493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68rpb\" (UniqueName: \"kubernetes.io/projected/44bda186-cc7a-4422-8266-5f494795cf7f-kube-api-access-68rpb\") pod \"cinder-f264-account-create-update-hm2jw\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") " pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.784747 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44bda186-cc7a-4422-8266-5f494795cf7f-operator-scripts\") pod \"cinder-f264-account-create-update-hm2jw\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") " pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.788048 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44bda186-cc7a-4422-8266-5f494795cf7f-operator-scripts\") pod \"cinder-f264-account-create-update-hm2jw\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") " pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.806794 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68rpb\" (UniqueName: \"kubernetes.io/projected/44bda186-cc7a-4422-8266-5f494795cf7f-kube-api-access-68rpb\") pod \"cinder-f264-account-create-update-hm2jw\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") " pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:43 crc kubenswrapper[4931]: I0130 06:39:43.871011 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:44 crc kubenswrapper[4931]: I0130 06:39:44.235875 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f264-account-create-update-hm2jw"]
Jan 30 06:39:44 crc kubenswrapper[4931]: I0130 06:39:44.246710 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f264-account-create-update-hm2jw" event={"ID":"44bda186-cc7a-4422-8266-5f494795cf7f","Type":"ContainerStarted","Data":"dd5867b3c6aa732ef49ab6ef7d805c2250f0d888186dd17b002ffcc4871b0ba1"}
Jan 30 06:39:44 crc kubenswrapper[4931]: I0130 06:39:44.286509 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7fmgw"]
Jan 30 06:39:45 crc kubenswrapper[4931]: I0130 06:39:45.262487 4931 generic.go:334] "Generic (PLEG): container finished" podID="44bda186-cc7a-4422-8266-5f494795cf7f" containerID="956ce554bd663761599c9dc4f978e7719f40043720c3d10db30cc18c76ff6127" exitCode=0
Jan 30 06:39:45 crc kubenswrapper[4931]: I0130 06:39:45.262610 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f264-account-create-update-hm2jw" event={"ID":"44bda186-cc7a-4422-8266-5f494795cf7f","Type":"ContainerDied","Data":"956ce554bd663761599c9dc4f978e7719f40043720c3d10db30cc18c76ff6127"}
Jan 30 06:39:45 crc kubenswrapper[4931]: I0130 06:39:45.267631 4931 generic.go:334] "Generic (PLEG): container finished" podID="a7909a37-a194-4666-b642-8193c2b8e29c" containerID="cff7e0d64b5667667e85a8a7d8d6d557567a72e224933981bb30fb75cc9c37a5" exitCode=0
Jan 30 06:39:45 crc kubenswrapper[4931]: I0130 06:39:45.267672 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7fmgw" event={"ID":"a7909a37-a194-4666-b642-8193c2b8e29c","Type":"ContainerDied","Data":"cff7e0d64b5667667e85a8a7d8d6d557567a72e224933981bb30fb75cc9c37a5"}
Jan 30 06:39:45 crc kubenswrapper[4931]: I0130 06:39:45.267697 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7fmgw" event={"ID":"a7909a37-a194-4666-b642-8193c2b8e29c","Type":"ContainerStarted","Data":"6b2dd84b807447d9c8b727f554e36988b6bedbe41f7dd0abb738cb7ae80014fc"}
Jan 30 06:39:46 crc kubenswrapper[4931]: I0130 06:39:46.934000 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:46 crc kubenswrapper[4931]: I0130 06:39:46.941558 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.048536 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7909a37-a194-4666-b642-8193c2b8e29c-operator-scripts\") pod \"a7909a37-a194-4666-b642-8193c2b8e29c\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") "
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.048711 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7gxb\" (UniqueName: \"kubernetes.io/projected/a7909a37-a194-4666-b642-8193c2b8e29c-kube-api-access-f7gxb\") pod \"a7909a37-a194-4666-b642-8193c2b8e29c\" (UID: \"a7909a37-a194-4666-b642-8193c2b8e29c\") "
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.048911 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68rpb\" (UniqueName: \"kubernetes.io/projected/44bda186-cc7a-4422-8266-5f494795cf7f-kube-api-access-68rpb\") pod \"44bda186-cc7a-4422-8266-5f494795cf7f\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") "
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.048951 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44bda186-cc7a-4422-8266-5f494795cf7f-operator-scripts\") pod \"44bda186-cc7a-4422-8266-5f494795cf7f\" (UID: \"44bda186-cc7a-4422-8266-5f494795cf7f\") "
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.049144 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7909a37-a194-4666-b642-8193c2b8e29c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7909a37-a194-4666-b642-8193c2b8e29c" (UID: "a7909a37-a194-4666-b642-8193c2b8e29c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.049613 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7909a37-a194-4666-b642-8193c2b8e29c-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.050026 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44bda186-cc7a-4422-8266-5f494795cf7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44bda186-cc7a-4422-8266-5f494795cf7f" (UID: "44bda186-cc7a-4422-8266-5f494795cf7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.053914 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7909a37-a194-4666-b642-8193c2b8e29c-kube-api-access-f7gxb" (OuterVolumeSpecName: "kube-api-access-f7gxb") pod "a7909a37-a194-4666-b642-8193c2b8e29c" (UID: "a7909a37-a194-4666-b642-8193c2b8e29c"). InnerVolumeSpecName "kube-api-access-f7gxb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.055851 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bda186-cc7a-4422-8266-5f494795cf7f-kube-api-access-68rpb" (OuterVolumeSpecName: "kube-api-access-68rpb") pod "44bda186-cc7a-4422-8266-5f494795cf7f" (UID: "44bda186-cc7a-4422-8266-5f494795cf7f"). InnerVolumeSpecName "kube-api-access-68rpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.151345 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7gxb\" (UniqueName: \"kubernetes.io/projected/a7909a37-a194-4666-b642-8193c2b8e29c-kube-api-access-f7gxb\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.151398 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68rpb\" (UniqueName: \"kubernetes.io/projected/44bda186-cc7a-4422-8266-5f494795cf7f-kube-api-access-68rpb\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.151443 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44bda186-cc7a-4422-8266-5f494795cf7f-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.291609 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f264-account-create-update-hm2jw" event={"ID":"44bda186-cc7a-4422-8266-5f494795cf7f","Type":"ContainerDied","Data":"dd5867b3c6aa732ef49ab6ef7d805c2250f0d888186dd17b002ffcc4871b0ba1"}
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.291646 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5867b3c6aa732ef49ab6ef7d805c2250f0d888186dd17b002ffcc4871b0ba1"
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.291659 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f264-account-create-update-hm2jw"
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.293784 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7fmgw" event={"ID":"a7909a37-a194-4666-b642-8193c2b8e29c","Type":"ContainerDied","Data":"6b2dd84b807447d9c8b727f554e36988b6bedbe41f7dd0abb738cb7ae80014fc"}
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.293806 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2dd84b807447d9c8b727f554e36988b6bedbe41f7dd0abb738cb7ae80014fc"
Jan 30 06:39:47 crc kubenswrapper[4931]: I0130 06:39:47.293873 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7fmgw"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.725333 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tfpsx"]
Jan 30 06:39:48 crc kubenswrapper[4931]: E0130 06:39:48.725903 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7909a37-a194-4666-b642-8193c2b8e29c" containerName="mariadb-database-create"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.725926 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7909a37-a194-4666-b642-8193c2b8e29c" containerName="mariadb-database-create"
Jan 30 06:39:48 crc kubenswrapper[4931]: E0130 06:39:48.725969 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bda186-cc7a-4422-8266-5f494795cf7f" containerName="mariadb-account-create-update"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.725981 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bda186-cc7a-4422-8266-5f494795cf7f" containerName="mariadb-account-create-update"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.726276 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7909a37-a194-4666-b642-8193c2b8e29c" containerName="mariadb-database-create"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.726312 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bda186-cc7a-4422-8266-5f494795cf7f" containerName="mariadb-account-create-update"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.727314 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.732406 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.732963 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-499hm"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.733193 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.743308 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tfpsx"]
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.898256 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqjk\" (UniqueName: \"kubernetes.io/projected/de0677a1-9051-4719-9e4e-142694e6683a-kube-api-access-jzqjk\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.899000 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-db-sync-config-data\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.899378 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-config-data\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.899560 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-combined-ca-bundle\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.899769 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-scripts\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:48 crc kubenswrapper[4931]: I0130 06:39:48.899832 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de0677a1-9051-4719-9e4e-142694e6683a-etc-machine-id\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.001849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-config-data\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.001997 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-combined-ca-bundle\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.002155 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-scripts\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.002212 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de0677a1-9051-4719-9e4e-142694e6683a-etc-machine-id\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.002354 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqjk\" (UniqueName: \"kubernetes.io/projected/de0677a1-9051-4719-9e4e-142694e6683a-kube-api-access-jzqjk\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.002466 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-db-sync-config-data\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.002487 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de0677a1-9051-4719-9e4e-142694e6683a-etc-machine-id\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.011025 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-config-data\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.014284 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-combined-ca-bundle\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.015131 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-scripts\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.015257 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-db-sync-config-data\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.033062 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqjk\" (UniqueName: \"kubernetes.io/projected/de0677a1-9051-4719-9e4e-142694e6683a-kube-api-access-jzqjk\") pod \"cinder-db-sync-tfpsx\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") " pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.056334 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:49 crc kubenswrapper[4931]: I0130 06:39:49.571632 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tfpsx"]
Jan 30 06:39:49 crc kubenswrapper[4931]: W0130 06:39:49.578161 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde0677a1_9051_4719_9e4e_142694e6683a.slice/crio-02130b7da365af6fbb4827061845a7d662a4972437bcbee99c02f1e731fe6ada WatchSource:0}: Error finding container 02130b7da365af6fbb4827061845a7d662a4972437bcbee99c02f1e731fe6ada: Status 404 returned error can't find the container with id 02130b7da365af6fbb4827061845a7d662a4972437bcbee99c02f1e731fe6ada
Jan 30 06:39:50 crc kubenswrapper[4931]: I0130 06:39:50.320254 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tfpsx" event={"ID":"de0677a1-9051-4719-9e4e-142694e6683a","Type":"ContainerStarted","Data":"02130b7da365af6fbb4827061845a7d662a4972437bcbee99c02f1e731fe6ada"}
Jan 30 06:39:51 crc kubenswrapper[4931]: I0130 06:39:51.337393 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tfpsx" event={"ID":"de0677a1-9051-4719-9e4e-142694e6683a","Type":"ContainerStarted","Data":"18638ce6c93ee0d191ba3ee6b587a88fb1ae5413bad9f52cba0bc5cd608d3a29"}
Jan 30 06:39:51 crc kubenswrapper[4931]: I0130 06:39:51.374986 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tfpsx" podStartSLOduration=3.374958873 podStartE2EDuration="3.374958873s" podCreationTimestamp="2026-01-30 06:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:51.361016153 +0000 UTC m=+5526.730926450" watchObservedRunningTime="2026-01-30 06:39:51.374958873 +0000 UTC m=+5526.744869170"
Jan 30 06:39:53 crc kubenswrapper[4931]: I0130 06:39:53.369485 4931 generic.go:334] "Generic (PLEG): container finished" podID="de0677a1-9051-4719-9e4e-142694e6683a" containerID="18638ce6c93ee0d191ba3ee6b587a88fb1ae5413bad9f52cba0bc5cd608d3a29" exitCode=0
Jan 30 06:39:53 crc kubenswrapper[4931]: I0130 06:39:53.369546 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tfpsx" event={"ID":"de0677a1-9051-4719-9e4e-142694e6683a","Type":"ContainerDied","Data":"18638ce6c93ee0d191ba3ee6b587a88fb1ae5413bad9f52cba0bc5cd608d3a29"}
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.813195 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tfpsx"
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.966167 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-db-sync-config-data\") pod \"de0677a1-9051-4719-9e4e-142694e6683a\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") "
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.966275 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-config-data\") pod \"de0677a1-9051-4719-9e4e-142694e6683a\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") "
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.966328 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-combined-ca-bundle\") pod \"de0677a1-9051-4719-9e4e-142694e6683a\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") "
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.966371 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqjk\" (UniqueName: \"kubernetes.io/projected/de0677a1-9051-4719-9e4e-142694e6683a-kube-api-access-jzqjk\") pod \"de0677a1-9051-4719-9e4e-142694e6683a\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") "
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.966414 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-scripts\") pod \"de0677a1-9051-4719-9e4e-142694e6683a\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") "
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.966477 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de0677a1-9051-4719-9e4e-142694e6683a-etc-machine-id\") pod \"de0677a1-9051-4719-9e4e-142694e6683a\" (UID: \"de0677a1-9051-4719-9e4e-142694e6683a\") "
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.967062 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de0677a1-9051-4719-9e4e-142694e6683a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "de0677a1-9051-4719-9e4e-142694e6683a" (UID: "de0677a1-9051-4719-9e4e-142694e6683a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.973519 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-scripts" (OuterVolumeSpecName: "scripts") pod "de0677a1-9051-4719-9e4e-142694e6683a" (UID: "de0677a1-9051-4719-9e4e-142694e6683a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.973661 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "de0677a1-9051-4719-9e4e-142694e6683a" (UID: "de0677a1-9051-4719-9e4e-142694e6683a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:54 crc kubenswrapper[4931]: I0130 06:39:54.973881 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de0677a1-9051-4719-9e4e-142694e6683a-kube-api-access-jzqjk" (OuterVolumeSpecName: "kube-api-access-jzqjk") pod "de0677a1-9051-4719-9e4e-142694e6683a" (UID: "de0677a1-9051-4719-9e4e-142694e6683a"). InnerVolumeSpecName "kube-api-access-jzqjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.015694 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-config-data" (OuterVolumeSpecName: "config-data") pod "de0677a1-9051-4719-9e4e-142694e6683a" (UID: "de0677a1-9051-4719-9e4e-142694e6683a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.022908 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de0677a1-9051-4719-9e4e-142694e6683a" (UID: "de0677a1-9051-4719-9e4e-142694e6683a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.069519 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.069562 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.069576 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.069591 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqjk\" (UniqueName: \"kubernetes.io/projected/de0677a1-9051-4719-9e4e-142694e6683a-kube-api-access-jzqjk\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.069604 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de0677a1-9051-4719-9e4e-142694e6683a-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.069617 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de0677a1-9051-4719-9e4e-142694e6683a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.390843 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tfpsx" event={"ID":"de0677a1-9051-4719-9e4e-142694e6683a","Type":"ContainerDied","Data":"02130b7da365af6fbb4827061845a7d662a4972437bcbee99c02f1e731fe6ada"} Jan 30 06:39:55 crc 
kubenswrapper[4931]: I0130 06:39:55.390878 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02130b7da365af6fbb4827061845a7d662a4972437bcbee99c02f1e731fe6ada" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.390942 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tfpsx" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.768006 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5879d4f7c5-x7dw2"] Jan 30 06:39:55 crc kubenswrapper[4931]: E0130 06:39:55.769355 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de0677a1-9051-4719-9e4e-142694e6683a" containerName="cinder-db-sync" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.769391 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0677a1-9051-4719-9e4e-142694e6683a" containerName="cinder-db-sync" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.774293 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="de0677a1-9051-4719-9e4e-142694e6683a" containerName="cinder-db-sync" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.785570 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.787456 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-nb\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.787596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9tk\" (UniqueName: \"kubernetes.io/projected/214b78b9-e769-4474-be87-e9b494c2fa69-kube-api-access-pn9tk\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.787689 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-dns-svc\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.787715 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-config\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.787847 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-sb\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" 
(UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.795374 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5879d4f7c5-x7dw2"] Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.890351 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9tk\" (UniqueName: \"kubernetes.io/projected/214b78b9-e769-4474-be87-e9b494c2fa69-kube-api-access-pn9tk\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.890781 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-dns-svc\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.890805 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-config\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.890866 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-sb\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.890927 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-nb\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.891932 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-config\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.892626 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-sb\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.893784 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-nb\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.893939 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-dns-svc\") pod \"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:55 crc kubenswrapper[4931]: I0130 06:39:55.923258 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9tk\" (UniqueName: \"kubernetes.io/projected/214b78b9-e769-4474-be87-e9b494c2fa69-kube-api-access-pn9tk\") pod 
\"dnsmasq-dns-5879d4f7c5-x7dw2\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.088902 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.092299 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.093223 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data-custom\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.093280 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.093325 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d301dc-5f68-4e1b-ae27-51aa02e45789-logs\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.093410 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv5dr\" (UniqueName: \"kubernetes.io/projected/91d301dc-5f68-4e1b-ae27-51aa02e45789-kube-api-access-lv5dr\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 
06:39:56.093533 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.093573 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91d301dc-5f68-4e1b-ae27-51aa02e45789-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.093625 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-scripts\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.096648 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.096802 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-499hm" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.096907 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.097122 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.112923 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.119068 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.194925 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-scripts\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195027 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data-custom\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195058 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195092 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d301dc-5f68-4e1b-ae27-51aa02e45789-logs\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195119 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv5dr\" (UniqueName: \"kubernetes.io/projected/91d301dc-5f68-4e1b-ae27-51aa02e45789-kube-api-access-lv5dr\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195136 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195152 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91d301dc-5f68-4e1b-ae27-51aa02e45789-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195261 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91d301dc-5f68-4e1b-ae27-51aa02e45789-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.195967 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d301dc-5f68-4e1b-ae27-51aa02e45789-logs\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.200848 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-scripts\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.201204 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.202271 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data-custom\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.205580 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.210304 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv5dr\" (UniqueName: \"kubernetes.io/projected/91d301dc-5f68-4e1b-ae27-51aa02e45789-kube-api-access-lv5dr\") pod \"cinder-api-0\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") " pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.413680 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.638487 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5879d4f7c5-x7dw2"] Jan 30 06:39:56 crc kubenswrapper[4931]: I0130 06:39:56.703199 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:39:57 crc kubenswrapper[4931]: I0130 06:39:57.429661 4931 generic.go:334] "Generic (PLEG): container finished" podID="214b78b9-e769-4474-be87-e9b494c2fa69" containerID="45cf3829eaba7efc9ffdbde5fa46c91facdbe555edf8963708f266596e0113d9" exitCode=0 Jan 30 06:39:57 crc kubenswrapper[4931]: I0130 06:39:57.437550 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" event={"ID":"214b78b9-e769-4474-be87-e9b494c2fa69","Type":"ContainerDied","Data":"45cf3829eaba7efc9ffdbde5fa46c91facdbe555edf8963708f266596e0113d9"} Jan 30 06:39:57 crc kubenswrapper[4931]: I0130 06:39:57.437606 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" event={"ID":"214b78b9-e769-4474-be87-e9b494c2fa69","Type":"ContainerStarted","Data":"1cea5345d991653f1a6830de732c35c4b2f81ed6821f46e956ec8f3a43e28720"} Jan 30 06:39:57 crc kubenswrapper[4931]: I0130 06:39:57.437627 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91d301dc-5f68-4e1b-ae27-51aa02e45789","Type":"ContainerStarted","Data":"63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a"} Jan 30 06:39:57 crc kubenswrapper[4931]: I0130 06:39:57.437648 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91d301dc-5f68-4e1b-ae27-51aa02e45789","Type":"ContainerStarted","Data":"845bdc0d8784551cc86053ca285aa0afc7bce0f017005659d5e194550515ea02"} Jan 30 06:39:58 crc kubenswrapper[4931]: I0130 06:39:58.477752 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" event={"ID":"214b78b9-e769-4474-be87-e9b494c2fa69","Type":"ContainerStarted","Data":"1b21e024974dbffc1b686bd5f52316fe76f211ede1a6fa05295886b31dbd35b5"} Jan 30 06:39:58 crc kubenswrapper[4931]: I0130 06:39:58.480611 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:39:58 crc kubenswrapper[4931]: I0130 06:39:58.497481 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91d301dc-5f68-4e1b-ae27-51aa02e45789","Type":"ContainerStarted","Data":"fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71"} Jan 30 06:39:58 crc kubenswrapper[4931]: I0130 06:39:58.498549 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 06:39:58 crc kubenswrapper[4931]: I0130 06:39:58.531245 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" podStartSLOduration=3.531226427 podStartE2EDuration="3.531226427s" podCreationTimestamp="2026-01-30 06:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:58.516795173 +0000 UTC m=+5533.886705440" watchObservedRunningTime="2026-01-30 06:39:58.531226427 +0000 UTC m=+5533.901136684" Jan 30 06:39:58 crc kubenswrapper[4931]: I0130 06:39:58.567923 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.567898903 podStartE2EDuration="2.567898903s" podCreationTimestamp="2026-01-30 06:39:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:58.551799433 +0000 UTC m=+5533.921709690" watchObservedRunningTime="2026-01-30 06:39:58.567898903 +0000 UTC m=+5533.937809160" Jan 30 06:40:06 crc kubenswrapper[4931]: 
I0130 06:40:06.121625 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.203178 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"] Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.203469 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" podUID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerName="dnsmasq-dns" containerID="cri-o://88c455ebf77b0c0d27d3a79f9baa7eb72c57c08194cb775791bc835634485c4f" gracePeriod=10 Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.593376 4931 generic.go:334] "Generic (PLEG): container finished" podID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerID="88c455ebf77b0c0d27d3a79f9baa7eb72c57c08194cb775791bc835634485c4f" exitCode=0 Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.593647 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" event={"ID":"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b","Type":"ContainerDied","Data":"88c455ebf77b0c0d27d3a79f9baa7eb72c57c08194cb775791bc835634485c4f"} Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.752345 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.845346 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-dns-svc\") pod \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") "
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.846371 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-config\") pod \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") "
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.846476 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-nb\") pod \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") "
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.846593 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drhrf\" (UniqueName: \"kubernetes.io/projected/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-kube-api-access-drhrf\") pod \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") "
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.846664 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-sb\") pod \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\" (UID: \"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b\") "
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.865660 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-kube-api-access-drhrf" (OuterVolumeSpecName: "kube-api-access-drhrf") pod "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" (UID: "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b"). InnerVolumeSpecName "kube-api-access-drhrf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.889172 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" (UID: "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.911089 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" (UID: "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.913039 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" (UID: "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.918040 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-config" (OuterVolumeSpecName: "config") pod "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" (UID: "bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.948989 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drhrf\" (UniqueName: \"kubernetes.io/projected/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-kube-api-access-drhrf\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.949026 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.949043 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.949057 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:06 crc kubenswrapper[4931]: I0130 06:40:06.949068 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.602455 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz" event={"ID":"bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b","Type":"ContainerDied","Data":"4b8f45f3ed9effeb333573d3d797f78260b16f7ed70972bc0ea03075747c1260"}
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.602754 4931 scope.go:117] "RemoveContainer" containerID="88c455ebf77b0c0d27d3a79f9baa7eb72c57c08194cb775791bc835634485c4f"
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.602562 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.631703 4931 scope.go:117] "RemoveContainer" containerID="f570ad04682e1e4020b7b6f4103c4537ec3f58b31b00dec90861ecb783916f09"
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.632912 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.654087 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6b7dcd95-j4hmz"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.727690 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.727932 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-log" containerID="cri-o://d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc" gracePeriod=30
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.728356 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-api" containerID="cri-o://b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155" gracePeriod=30
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.736494 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.736695 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-log" containerID="cri-o://a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd" gracePeriod=30
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.737057 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-metadata" containerID="cri-o://7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a" gracePeriod=30
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.747474 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.747710 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="2e99598c-cc27-462b-8c5b-9647fdc031dc" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a67c2af59ce774fac5d99d16e2c4d0308297f692031a8a81e468f5bef97702ac" gracePeriod=30
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.755583 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.755969 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0cc67f7b-ac8c-4b63-8f28-fd5135307022" containerName="nova-scheduler-scheduler" containerID="cri-o://51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635" gracePeriod=30
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.763659 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.763868 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="dc16d452-7a63-4d86-b729-2f7384b3ea73" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53" gracePeriod=30
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.801637 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 30 06:40:07 crc kubenswrapper[4931]: I0130 06:40:07.801827 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="c9b04495-2e29-4188-adbe-e6ed3669c25a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://7956c67be4873a213d6ce531a234902ab8e420fef8d81bd2f9cc50a55b2ed19e" gracePeriod=30
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.608749 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.612246 4931 generic.go:334] "Generic (PLEG): container finished" podID="4c56d95d-5087-41db-a759-2273aef32a3c" containerID="a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd" exitCode=143
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.612285 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c56d95d-5087-41db-a759-2273aef32a3c","Type":"ContainerDied","Data":"a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd"}
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.615995 4931 generic.go:334] "Generic (PLEG): container finished" podID="dc16d452-7a63-4d86-b729-2f7384b3ea73" containerID="781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53" exitCode=0
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.616055 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc16d452-7a63-4d86-b729-2f7384b3ea73","Type":"ContainerDied","Data":"781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53"}
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.616100 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc16d452-7a63-4d86-b729-2f7384b3ea73","Type":"ContainerDied","Data":"be2cb5795144a0336c26c2ce840d01e8f6b40f1134f0aca0ca6716edd8f9b6e4"}
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.616117 4931 scope.go:117] "RemoveContainer" containerID="781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53"
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.616230 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.619955 4931 generic.go:334] "Generic (PLEG): container finished" podID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerID="d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc" exitCode=143
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.620020 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2530f454-5ee2-4767-8c0b-75d50ba8a44b","Type":"ContainerDied","Data":"d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc"}
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.622497 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.656903 4931 scope.go:117] "RemoveContainer" containerID="781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53"
Jan 30 06:40:08 crc kubenswrapper[4931]: E0130 06:40:08.657243 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53\": container with ID starting with 781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53 not found: ID does not exist" containerID="781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53"
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.657272 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53"} err="failed to get container status \"781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53\": rpc error: code = NotFound desc = could not find container \"781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53\": container with ID starting with 781c9f4814f32596f45ca8f36783e39cf9745733137e6b3a5832e087c7162f53 not found: ID does not exist"
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.683839 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-config-data\") pod \"dc16d452-7a63-4d86-b729-2f7384b3ea73\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") "
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.683931 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-combined-ca-bundle\") pod \"dc16d452-7a63-4d86-b729-2f7384b3ea73\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") "
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.684013 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmssh\" (UniqueName: \"kubernetes.io/projected/dc16d452-7a63-4d86-b729-2f7384b3ea73-kube-api-access-vmssh\") pod \"dc16d452-7a63-4d86-b729-2f7384b3ea73\" (UID: \"dc16d452-7a63-4d86-b729-2f7384b3ea73\") "
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.714796 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc16d452-7a63-4d86-b729-2f7384b3ea73-kube-api-access-vmssh" (OuterVolumeSpecName: "kube-api-access-vmssh") pod "dc16d452-7a63-4d86-b729-2f7384b3ea73" (UID: "dc16d452-7a63-4d86-b729-2f7384b3ea73"). InnerVolumeSpecName "kube-api-access-vmssh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.735868 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-config-data" (OuterVolumeSpecName: "config-data") pod "dc16d452-7a63-4d86-b729-2f7384b3ea73" (UID: "dc16d452-7a63-4d86-b729-2f7384b3ea73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.753399 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc16d452-7a63-4d86-b729-2f7384b3ea73" (UID: "dc16d452-7a63-4d86-b729-2f7384b3ea73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.785645 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.785684 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc16d452-7a63-4d86-b729-2f7384b3ea73-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.785695 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmssh\" (UniqueName: \"kubernetes.io/projected/dc16d452-7a63-4d86-b729-2f7384b3ea73-kube-api-access-vmssh\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.960695 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 06:40:08 crc kubenswrapper[4931]: I0130 06:40:08.982827 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:08.999620 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: E0130 06:40:08.999974 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc16d452-7a63-4d86-b729-2f7384b3ea73" containerName="nova-cell1-novncproxy-novncproxy"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:08.999988 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc16d452-7a63-4d86-b729-2f7384b3ea73" containerName="nova-cell1-novncproxy-novncproxy"
Jan 30 06:40:09 crc kubenswrapper[4931]: E0130 06:40:09.000003 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerName="init"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.000009 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerName="init"
Jan 30 06:40:09 crc kubenswrapper[4931]: E0130 06:40:09.000028 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerName="dnsmasq-dns"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.000035 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerName="dnsmasq-dns"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.000211 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" containerName="dnsmasq-dns"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.000225 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc16d452-7a63-4d86-b729-2f7384b3ea73" containerName="nova-cell1-novncproxy-novncproxy"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.000850 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.004462 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.010741 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.090570 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e49a5c-323c-46de-b34f-2fef9465e277-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.090732 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9txz\" (UniqueName: \"kubernetes.io/projected/c9e49a5c-323c-46de-b34f-2fef9465e277-kube-api-access-d9txz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.090785 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e49a5c-323c-46de-b34f-2fef9465e277-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.109303 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.195041 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9txz\" (UniqueName: \"kubernetes.io/projected/c9e49a5c-323c-46de-b34f-2fef9465e277-kube-api-access-d9txz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.195089 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e49a5c-323c-46de-b34f-2fef9465e277-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.195111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e49a5c-323c-46de-b34f-2fef9465e277-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.201539 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e49a5c-323c-46de-b34f-2fef9465e277-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.201667 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e49a5c-323c-46de-b34f-2fef9465e277-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.210188 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9txz\" (UniqueName: \"kubernetes.io/projected/c9e49a5c-323c-46de-b34f-2fef9465e277-kube-api-access-d9txz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c9e49a5c-323c-46de-b34f-2fef9465e277\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.296199 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpxfx\" (UniqueName: \"kubernetes.io/projected/0cc67f7b-ac8c-4b63-8f28-fd5135307022-kube-api-access-gpxfx\") pod \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") "
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.296319 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-config-data\") pod \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") "
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.296454 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-combined-ca-bundle\") pod \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\" (UID: \"0cc67f7b-ac8c-4b63-8f28-fd5135307022\") "
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.300564 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc67f7b-ac8c-4b63-8f28-fd5135307022-kube-api-access-gpxfx" (OuterVolumeSpecName: "kube-api-access-gpxfx") pod "0cc67f7b-ac8c-4b63-8f28-fd5135307022" (UID: "0cc67f7b-ac8c-4b63-8f28-fd5135307022"). InnerVolumeSpecName "kube-api-access-gpxfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.321714 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.333647 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-config-data" (OuterVolumeSpecName: "config-data") pod "0cc67f7b-ac8c-4b63-8f28-fd5135307022" (UID: "0cc67f7b-ac8c-4b63-8f28-fd5135307022"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.350627 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cc67f7b-ac8c-4b63-8f28-fd5135307022" (UID: "0cc67f7b-ac8c-4b63-8f28-fd5135307022"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.400269 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpxfx\" (UniqueName: \"kubernetes.io/projected/0cc67f7b-ac8c-4b63-8f28-fd5135307022-kube-api-access-gpxfx\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.400304 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.400316 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc67f7b-ac8c-4b63-8f28-fd5135307022-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.445016 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b" path="/var/lib/kubelet/pods/bf1a68d3-4ec7-46e5-9fee-ea7f89ec8c7b/volumes"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.445781 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc16d452-7a63-4d86-b729-2f7384b3ea73" path="/var/lib/kubelet/pods/dc16d452-7a63-4d86-b729-2f7384b3ea73/volumes"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.649892 4931 generic.go:334] "Generic (PLEG): container finished" podID="0cc67f7b-ac8c-4b63-8f28-fd5135307022" containerID="51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635" exitCode=0
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.649978 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.650002 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc67f7b-ac8c-4b63-8f28-fd5135307022","Type":"ContainerDied","Data":"51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635"}
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.650860 4931 scope.go:117] "RemoveContainer" containerID="51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.650748 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc67f7b-ac8c-4b63-8f28-fd5135307022","Type":"ContainerDied","Data":"caa956bb861b331dfc23294a937380376a24f5f4a7dcf1c49c1dbdd00bea437a"}
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.678457 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.690561 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.691019 4931 scope.go:117] "RemoveContainer" containerID="51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635"
Jan 30 06:40:09 crc kubenswrapper[4931]: E0130 06:40:09.692275 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635\": container with ID starting with 51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635 not found: ID does not exist" containerID="51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.692319 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635"} err="failed to get container status \"51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635\": rpc error: code = NotFound desc = could not find container \"51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635\": container with ID starting with 51fe2630264221f4af34441e598c9cd1502c1a46402bb5792e5a94d0c5b84635 not found: ID does not exist"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.696017 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: E0130 06:40:09.696453 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc67f7b-ac8c-4b63-8f28-fd5135307022" containerName="nova-scheduler-scheduler"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.696468 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc67f7b-ac8c-4b63-8f28-fd5135307022" containerName="nova-scheduler-scheduler"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.696651 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc67f7b-ac8c-4b63-8f28-fd5135307022" containerName="nova-scheduler-scheduler"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.699483 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.701696 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.704352 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.806034 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-config-data\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.806329 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxvrc\" (UniqueName: \"kubernetes.io/projected/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-kube-api-access-dxvrc\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.806510 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.853243 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 06:40:09 crc kubenswrapper[4931]: W0130 06:40:09.860524 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e49a5c_323c_46de_b34f_2fef9465e277.slice/crio-c0dd31a4534f326db817c22d6807a0e67f9ac0247280e5e3a530799c96005dac WatchSource:0}: Error finding container c0dd31a4534f326db817c22d6807a0e67f9ac0247280e5e3a530799c96005dac: Status 404 returned error can't find the container with id c0dd31a4534f326db817c22d6807a0e67f9ac0247280e5e3a530799c96005dac
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.908178 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-config-data\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.908722 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxvrc\" (UniqueName: \"kubernetes.io/projected/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-kube-api-access-dxvrc\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.908773 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.915102 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-config-data\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.923729 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:09 crc kubenswrapper[4931]: I0130 06:40:09.927109 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxvrc\" (UniqueName: \"kubernetes.io/projected/ee8c9751-e3b5-4031-bb46-a7e5fae46f4e-kube-api-access-dxvrc\") pod \"nova-scheduler-0\" (UID: \"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e\") " pod="openstack/nova-scheduler-0"
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.016696 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.491964 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.668255 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c9e49a5c-323c-46de-b34f-2fef9465e277","Type":"ContainerStarted","Data":"ee6b7b39bb21791ee55bb92d2b9260f291bcc5fdcb1a92e68a988767da00ca1b"}
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.668660 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c9e49a5c-323c-46de-b34f-2fef9465e277","Type":"ContainerStarted","Data":"c0dd31a4534f326db817c22d6807a0e67f9ac0247280e5e3a530799c96005dac"}
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.678500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e","Type":"ContainerStarted","Data":"d547cfddc008b2b031e47460775030aadad9c1e280fedd069542e543986c0052"}
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.688304 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.688289702 podStartE2EDuration="2.688289702s" podCreationTimestamp="2026-01-30 06:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:10.684042673 +0000 UTC m=+5546.053952940" watchObservedRunningTime="2026-01-30 06:40:10.688289702 +0000 UTC m=+5546.058199969"
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.692975 4931 generic.go:334] "Generic (PLEG): container finished" podID="c9b04495-2e29-4188-adbe-e6ed3669c25a" containerID="7956c67be4873a213d6ce531a234902ab8e420fef8d81bd2f9cc50a55b2ed19e" exitCode=0
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.693058 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c9b04495-2e29-4188-adbe-e6ed3669c25a","Type":"ContainerDied","Data":"7956c67be4873a213d6ce531a234902ab8e420fef8d81bd2f9cc50a55b2ed19e"}
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.693091 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c9b04495-2e29-4188-adbe-e6ed3669c25a","Type":"ContainerDied","Data":"c53e952c29d0f7bb7753df2ecd373b270a2a034437bebd33a1a8707e3ab33ea8"}
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.693105 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c53e952c29d0f7bb7753df2ecd373b270a2a034437bebd33a1a8707e3ab33ea8"
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.746949 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.839370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt4r9\" (UniqueName: \"kubernetes.io/projected/c9b04495-2e29-4188-adbe-e6ed3669c25a-kube-api-access-tt4r9\") pod \"c9b04495-2e29-4188-adbe-e6ed3669c25a\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") "
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.839654 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-combined-ca-bundle\") pod \"c9b04495-2e29-4188-adbe-e6ed3669c25a\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") "
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.839687 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-config-data\") pod \"c9b04495-2e29-4188-adbe-e6ed3669c25a\" (UID: \"c9b04495-2e29-4188-adbe-e6ed3669c25a\") "
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.843619 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b04495-2e29-4188-adbe-e6ed3669c25a-kube-api-access-tt4r9" (OuterVolumeSpecName: "kube-api-access-tt4r9") pod "c9b04495-2e29-4188-adbe-e6ed3669c25a" (UID: "c9b04495-2e29-4188-adbe-e6ed3669c25a"). InnerVolumeSpecName "kube-api-access-tt4r9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.867213 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-config-data" (OuterVolumeSpecName: "config-data") pod "c9b04495-2e29-4188-adbe-e6ed3669c25a" (UID: "c9b04495-2e29-4188-adbe-e6ed3669c25a"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.868008 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9b04495-2e29-4188-adbe-e6ed3669c25a" (UID: "c9b04495-2e29-4188-adbe-e6ed3669c25a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.885034 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": read tcp 10.217.0.2:44348->10.217.1.76:8774: read: connection reset by peer" Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.885118 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": read tcp 10.217.0.2:44340->10.217.1.76:8774: read: connection reset by peer" Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.908793 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": read tcp 10.217.0.2:48292->10.217.1.75:8775: read: connection reset by peer" Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.909032 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": read tcp 10.217.0.2:48286->10.217.1.75:8775: read: connection reset by peer" Jan 30 06:40:10 crc 
kubenswrapper[4931]: I0130 06:40:10.943319 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt4r9\" (UniqueName: \"kubernetes.io/projected/c9b04495-2e29-4188-adbe-e6ed3669c25a-kube-api-access-tt4r9\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.943367 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:10 crc kubenswrapper[4931]: I0130 06:40:10.943381 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b04495-2e29-4188-adbe-e6ed3669c25a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.219491 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.245493 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chqpp\" (UniqueName: \"kubernetes.io/projected/2530f454-5ee2-4767-8c0b-75d50ba8a44b-kube-api-access-chqpp\") pod \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.245558 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-combined-ca-bundle\") pod \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.245586 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-config-data\") pod \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\" (UID: 
\"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.245653 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530f454-5ee2-4767-8c0b-75d50ba8a44b-logs\") pod \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\" (UID: \"2530f454-5ee2-4767-8c0b-75d50ba8a44b\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.246278 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2530f454-5ee2-4767-8c0b-75d50ba8a44b-logs" (OuterVolumeSpecName: "logs") pod "2530f454-5ee2-4767-8c0b-75d50ba8a44b" (UID: "2530f454-5ee2-4767-8c0b-75d50ba8a44b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.282952 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2530f454-5ee2-4767-8c0b-75d50ba8a44b-kube-api-access-chqpp" (OuterVolumeSpecName: "kube-api-access-chqpp") pod "2530f454-5ee2-4767-8c0b-75d50ba8a44b" (UID: "2530f454-5ee2-4767-8c0b-75d50ba8a44b"). InnerVolumeSpecName "kube-api-access-chqpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.300728 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2530f454-5ee2-4767-8c0b-75d50ba8a44b" (UID: "2530f454-5ee2-4767-8c0b-75d50ba8a44b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.307457 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-config-data" (OuterVolumeSpecName: "config-data") pod "2530f454-5ee2-4767-8c0b-75d50ba8a44b" (UID: "2530f454-5ee2-4767-8c0b-75d50ba8a44b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.347206 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chqpp\" (UniqueName: \"kubernetes.io/projected/2530f454-5ee2-4767-8c0b-75d50ba8a44b-kube-api-access-chqpp\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.347244 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.347254 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2530f454-5ee2-4767-8c0b-75d50ba8a44b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.347262 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530f454-5ee2-4767-8c0b-75d50ba8a44b-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.446451 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc67f7b-ac8c-4b63-8f28-fd5135307022" path="/var/lib/kubelet/pods/0cc67f7b-ac8c-4b63-8f28-fd5135307022/volumes" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.466016 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.655115 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-combined-ca-bundle\") pod \"4c56d95d-5087-41db-a759-2273aef32a3c\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.655586 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-config-data\") pod \"4c56d95d-5087-41db-a759-2273aef32a3c\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.655709 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c56d95d-5087-41db-a759-2273aef32a3c-logs\") pod \"4c56d95d-5087-41db-a759-2273aef32a3c\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.655764 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcgzf\" (UniqueName: \"kubernetes.io/projected/4c56d95d-5087-41db-a759-2273aef32a3c-kube-api-access-wcgzf\") pod \"4c56d95d-5087-41db-a759-2273aef32a3c\" (UID: \"4c56d95d-5087-41db-a759-2273aef32a3c\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.658603 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c56d95d-5087-41db-a759-2273aef32a3c-logs" (OuterVolumeSpecName: "logs") pod "4c56d95d-5087-41db-a759-2273aef32a3c" (UID: "4c56d95d-5087-41db-a759-2273aef32a3c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.671706 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c56d95d-5087-41db-a759-2273aef32a3c-kube-api-access-wcgzf" (OuterVolumeSpecName: "kube-api-access-wcgzf") pod "4c56d95d-5087-41db-a759-2273aef32a3c" (UID: "4c56d95d-5087-41db-a759-2273aef32a3c"). InnerVolumeSpecName "kube-api-access-wcgzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.682678 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c56d95d-5087-41db-a759-2273aef32a3c" (UID: "4c56d95d-5087-41db-a759-2273aef32a3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.727582 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-config-data" (OuterVolumeSpecName: "config-data") pod "4c56d95d-5087-41db-a759-2273aef32a3c" (UID: "4c56d95d-5087-41db-a759-2273aef32a3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.727727 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee8c9751-e3b5-4031-bb46-a7e5fae46f4e","Type":"ContainerStarted","Data":"e691c1952efaf391dd8add1402f864ca4904050c83ad6a836531fb2e18c4d1da"} Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.746357 4931 generic.go:334] "Generic (PLEG): container finished" podID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerID="b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155" exitCode=0 Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.746451 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2530f454-5ee2-4767-8c0b-75d50ba8a44b","Type":"ContainerDied","Data":"b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155"} Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.746473 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2530f454-5ee2-4767-8c0b-75d50ba8a44b","Type":"ContainerDied","Data":"aef8e55688d59f4d24924520cc3d195a0fe2137c7ba9584c52f87f8f22ad3859"} Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.746490 4931 scope.go:117] "RemoveContainer" containerID="b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.746597 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.747065 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.747052541 podStartE2EDuration="2.747052541s" podCreationTimestamp="2026-01-30 06:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:11.73916438 +0000 UTC m=+5547.109074637" watchObservedRunningTime="2026-01-30 06:40:11.747052541 +0000 UTC m=+5547.116962808" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.759550 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.759594 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c56d95d-5087-41db-a759-2273aef32a3c-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.759607 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcgzf\" (UniqueName: \"kubernetes.io/projected/4c56d95d-5087-41db-a759-2273aef32a3c-kube-api-access-wcgzf\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.759619 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c56d95d-5087-41db-a759-2273aef32a3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.763373 4931 generic.go:334] "Generic (PLEG): container finished" podID="4c56d95d-5087-41db-a759-2273aef32a3c" containerID="7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a" exitCode=0 Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.763468 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c56d95d-5087-41db-a759-2273aef32a3c","Type":"ContainerDied","Data":"7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a"} Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.763508 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c56d95d-5087-41db-a759-2273aef32a3c","Type":"ContainerDied","Data":"927c72bfe637ff62dfcafae17c76ddbeded161c27b84f0c5f70d67241dbe6fdd"} Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.763587 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.772214 4931 generic.go:334] "Generic (PLEG): container finished" podID="2e99598c-cc27-462b-8c5b-9647fdc031dc" containerID="a67c2af59ce774fac5d99d16e2c4d0308297f692031a8a81e468f5bef97702ac" exitCode=0 Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.772332 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.773968 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2e99598c-cc27-462b-8c5b-9647fdc031dc","Type":"ContainerDied","Data":"a67c2af59ce774fac5d99d16e2c4d0308297f692031a8a81e468f5bef97702ac"} Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.774041 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2e99598c-cc27-462b-8c5b-9647fdc031dc","Type":"ContainerDied","Data":"7484fc206458a3c7c0f0725319e96b32501a736307f2141c6d77f6213f261ff9"} Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.774060 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7484fc206458a3c7c0f0725319e96b32501a736307f2141c6d77f6213f261ff9" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.779238 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.805955 4931 scope.go:117] "RemoveContainer" containerID="d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.816728 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.843186 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.846307 4931 scope.go:117] "RemoveContainer" containerID="b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.848300 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155\": container with ID starting with b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155 not found: ID does not exist" containerID="b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.848364 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155"} err="failed to get container status \"b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155\": rpc error: code = NotFound desc = could not find container \"b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155\": container with ID starting with b0f9685dc882224c0d513d40000ac1e1655fbf5c7112da5e2f6f7f5866677155 not found: ID does not exist" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.848385 4931 scope.go:117] "RemoveContainer" containerID="d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 
06:40:11.849807 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc\": container with ID starting with d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc not found: ID does not exist" containerID="d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.849853 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc"} err="failed to get container status \"d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc\": rpc error: code = NotFound desc = could not find container \"d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc\": container with ID starting with d792fe84be06f29a0f9a0f35ef73368f3a67a29ef7cf6174c301187eb913f1bc not found: ID does not exist" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.849869 4931 scope.go:117] "RemoveContainer" containerID="7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.855385 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874073 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.874432 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b04495-2e29-4188-adbe-e6ed3669c25a" containerName="nova-cell1-conductor-conductor" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874448 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b04495-2e29-4188-adbe-e6ed3669c25a" containerName="nova-cell1-conductor-conductor" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.874477 4931 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-log" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874486 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-log" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.874496 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-metadata" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874502 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-metadata" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.874512 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-api" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874518 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-api" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.874526 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-log" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874533 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-log" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.874541 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e99598c-cc27-462b-8c5b-9647fdc031dc" containerName="nova-cell0-conductor-conductor" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874546 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e99598c-cc27-462b-8c5b-9647fdc031dc" containerName="nova-cell0-conductor-conductor" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 
06:40:11.874700 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-log" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874712 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-log" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874722 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" containerName="nova-metadata-metadata" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874735 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e99598c-cc27-462b-8c5b-9647fdc031dc" containerName="nova-cell0-conductor-conductor" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874743 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" containerName="nova-api-api" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.874752 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b04495-2e29-4188-adbe-e6ed3669c25a" containerName="nova-cell1-conductor-conductor" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.875590 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.880465 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.903917 4931 scope.go:117] "RemoveContainer" containerID="a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.904328 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.916902 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.927611 4931 scope.go:117] "RemoveContainer" containerID="7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a" Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.928128 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a\": container with ID starting with 7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a not found: ID does not exist" containerID="7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.928169 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a"} err="failed to get container status \"7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a\": rpc error: code = NotFound desc = could not find container \"7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a\": container with ID starting with 7ba088cb21d714f69636911e519755523520f6a3c24d7bc8ee678e814ba0634a not found: ID does not exist" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.928192 
4931 scope.go:117] "RemoveContainer" containerID="a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.929193 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: E0130 06:40:11.929799 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd\": container with ID starting with a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd not found: ID does not exist" containerID="a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.929827 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd"} err="failed to get container status \"a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd\": rpc error: code = NotFound desc = could not find container \"a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd\": container with ID starting with a9c10b94942609fec99a9c8583d382387968d89a6e00090015b7a098ed4bbbfd not found: ID does not exist" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.930373 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.932271 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.957173 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.962358 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.962954 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-combined-ca-bundle\") pod \"2e99598c-cc27-462b-8c5b-9647fdc031dc\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.963108 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-config-data\") pod \"2e99598c-cc27-462b-8c5b-9647fdc031dc\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.963147 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w9bw\" (UniqueName: \"kubernetes.io/projected/2e99598c-cc27-462b-8c5b-9647fdc031dc-kube-api-access-7w9bw\") pod \"2e99598c-cc27-462b-8c5b-9647fdc031dc\" (UID: \"2e99598c-cc27-462b-8c5b-9647fdc031dc\") " Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.968914 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.970747 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e99598c-cc27-462b-8c5b-9647fdc031dc-kube-api-access-7w9bw" 
(OuterVolumeSpecName: "kube-api-access-7w9bw") pod "2e99598c-cc27-462b-8c5b-9647fdc031dc" (UID: "2e99598c-cc27-462b-8c5b-9647fdc031dc"). InnerVolumeSpecName "kube-api-access-7w9bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.976635 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.978216 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:40:11 crc kubenswrapper[4931]: I0130 06:40:11.981844 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:11.997753 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.005545 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-config-data" (OuterVolumeSpecName: "config-data") pod "2e99598c-cc27-462b-8c5b-9647fdc031dc" (UID: "2e99598c-cc27-462b-8c5b-9647fdc031dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.036961 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e99598c-cc27-462b-8c5b-9647fdc031dc" (UID: "2e99598c-cc27-462b-8c5b-9647fdc031dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.064872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d940c452-f401-4c40-accd-cb3178bc0490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.064914 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12d7d7a-0b33-425e-98be-5a28ef924b22-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.064948 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d940c452-f401-4c40-accd-cb3178bc0490-logs\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.064974 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4lxm\" (UniqueName: \"kubernetes.io/projected/d940c452-f401-4c40-accd-cb3178bc0490-kube-api-access-h4lxm\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.065094 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12d7d7a-0b33-425e-98be-5a28ef924b22-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: 
I0130 06:40:12.065142 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75245\" (UniqueName: \"kubernetes.io/projected/a12d7d7a-0b33-425e-98be-5a28ef924b22-kube-api-access-75245\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.065174 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d940c452-f401-4c40-accd-cb3178bc0490-config-data\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.065379 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.065416 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e99598c-cc27-462b-8c5b-9647fdc031dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.065455 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w9bw\" (UniqueName: \"kubernetes.io/projected/2e99598c-cc27-462b-8c5b-9647fdc031dc-kube-api-access-7w9bw\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166364 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12d7d7a-0b33-425e-98be-5a28ef924b22-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166473 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36294ba3-fcdd-45cd-b4ff-20ee280751da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166498 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75245\" (UniqueName: \"kubernetes.io/projected/a12d7d7a-0b33-425e-98be-5a28ef924b22-kube-api-access-75245\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166515 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36294ba3-fcdd-45cd-b4ff-20ee280751da-logs\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166551 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d940c452-f401-4c40-accd-cb3178bc0490-config-data\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166656 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d940c452-f401-4c40-accd-cb3178bc0490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166705 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a12d7d7a-0b33-425e-98be-5a28ef924b22-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166766 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d940c452-f401-4c40-accd-cb3178bc0490-logs\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166813 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4lxm\" (UniqueName: \"kubernetes.io/projected/d940c452-f401-4c40-accd-cb3178bc0490-kube-api-access-h4lxm\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.166885 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwlhm\" (UniqueName: \"kubernetes.io/projected/36294ba3-fcdd-45cd-b4ff-20ee280751da-kube-api-access-kwlhm\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.167028 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36294ba3-fcdd-45cd-b4ff-20ee280751da-config-data\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.167258 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d940c452-f401-4c40-accd-cb3178bc0490-logs\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 
06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.170770 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d940c452-f401-4c40-accd-cb3178bc0490-config-data\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.171027 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12d7d7a-0b33-425e-98be-5a28ef924b22-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.172140 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d940c452-f401-4c40-accd-cb3178bc0490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.179037 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12d7d7a-0b33-425e-98be-5a28ef924b22-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.200780 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4lxm\" (UniqueName: \"kubernetes.io/projected/d940c452-f401-4c40-accd-cb3178bc0490-kube-api-access-h4lxm\") pod \"nova-api-0\" (UID: \"d940c452-f401-4c40-accd-cb3178bc0490\") " pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.205818 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75245\" (UniqueName: 
\"kubernetes.io/projected/a12d7d7a-0b33-425e-98be-5a28ef924b22-kube-api-access-75245\") pod \"nova-cell1-conductor-0\" (UID: \"a12d7d7a-0b33-425e-98be-5a28ef924b22\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.210681 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.257936 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.269061 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36294ba3-fcdd-45cd-b4ff-20ee280751da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.269404 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36294ba3-fcdd-45cd-b4ff-20ee280751da-logs\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.269617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwlhm\" (UniqueName: \"kubernetes.io/projected/36294ba3-fcdd-45cd-b4ff-20ee280751da-kube-api-access-kwlhm\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.269736 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36294ba3-fcdd-45cd-b4ff-20ee280751da-config-data\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 
30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.269877 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36294ba3-fcdd-45cd-b4ff-20ee280751da-logs\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.273924 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36294ba3-fcdd-45cd-b4ff-20ee280751da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.276163 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36294ba3-fcdd-45cd-b4ff-20ee280751da-config-data\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.286984 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwlhm\" (UniqueName: \"kubernetes.io/projected/36294ba3-fcdd-45cd-b4ff-20ee280751da-kube-api-access-kwlhm\") pod \"nova-metadata-0\" (UID: \"36294ba3-fcdd-45cd-b4ff-20ee280751da\") " pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.336774 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.750980 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.789894 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.790090 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d940c452-f401-4c40-accd-cb3178bc0490","Type":"ContainerStarted","Data":"6495728d5eb291bc298003d3849c7a2291b20db607092e67a996db42ddd92122"} Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.876666 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.896869 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.909146 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.926492 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.928136 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.936905 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.939714 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:40:12 crc kubenswrapper[4931]: W0130 06:40:12.944068 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36294ba3_fcdd_45cd_b4ff_20ee280751da.slice/crio-0de1279c14e6faaac345c37b480ef12bead176ff1e3f16d3852d6ef4545ce2e3 WatchSource:0}: Error finding container 0de1279c14e6faaac345c37b480ef12bead176ff1e3f16d3852d6ef4545ce2e3: Status 404 returned error can't find the container with id 0de1279c14e6faaac345c37b480ef12bead176ff1e3f16d3852d6ef4545ce2e3 Jan 30 06:40:12 crc kubenswrapper[4931]: I0130 06:40:12.959284 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.084841 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79469af6-a764-49c6-beaf-b49185c1028a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.084892 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79469af6-a764-49c6-beaf-b49185c1028a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.084940 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxwjc\" (UniqueName: \"kubernetes.io/projected/79469af6-a764-49c6-beaf-b49185c1028a-kube-api-access-wxwjc\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.186172 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79469af6-a764-49c6-beaf-b49185c1028a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.186528 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79469af6-a764-49c6-beaf-b49185c1028a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.186589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxwjc\" (UniqueName: \"kubernetes.io/projected/79469af6-a764-49c6-beaf-b49185c1028a-kube-api-access-wxwjc\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.190216 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79469af6-a764-49c6-beaf-b49185c1028a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.190279 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/79469af6-a764-49c6-beaf-b49185c1028a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.206859 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxwjc\" (UniqueName: \"kubernetes.io/projected/79469af6-a764-49c6-beaf-b49185c1028a-kube-api-access-wxwjc\") pod \"nova-cell0-conductor-0\" (UID: \"79469af6-a764-49c6-beaf-b49185c1028a\") " pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.278123 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.433135 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2530f454-5ee2-4767-8c0b-75d50ba8a44b" path="/var/lib/kubelet/pods/2530f454-5ee2-4767-8c0b-75d50ba8a44b/volumes" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.433815 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e99598c-cc27-462b-8c5b-9647fdc031dc" path="/var/lib/kubelet/pods/2e99598c-cc27-462b-8c5b-9647fdc031dc/volumes" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.434542 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c56d95d-5087-41db-a759-2273aef32a3c" path="/var/lib/kubelet/pods/4c56d95d-5087-41db-a759-2273aef32a3c/volumes" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.435807 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b04495-2e29-4188-adbe-e6ed3669c25a" path="/var/lib/kubelet/pods/c9b04495-2e29-4188-adbe-e6ed3669c25a/volumes" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.726216 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:40:13 crc kubenswrapper[4931]: W0130 06:40:13.727632 4931 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79469af6_a764_49c6_beaf_b49185c1028a.slice/crio-7e510364c79ed8c922a3e902159f23b4f7230095aca34552eca7cbe39662356b WatchSource:0}: Error finding container 7e510364c79ed8c922a3e902159f23b4f7230095aca34552eca7cbe39662356b: Status 404 returned error can't find the container with id 7e510364c79ed8c922a3e902159f23b4f7230095aca34552eca7cbe39662356b Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.802248 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"79469af6-a764-49c6-beaf-b49185c1028a","Type":"ContainerStarted","Data":"7e510364c79ed8c922a3e902159f23b4f7230095aca34552eca7cbe39662356b"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.803523 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a12d7d7a-0b33-425e-98be-5a28ef924b22","Type":"ContainerStarted","Data":"9c3841e76949c898f5084dfbae3848ae947bb2904d6e1524763919edb31359be"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.803555 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a12d7d7a-0b33-425e-98be-5a28ef924b22","Type":"ContainerStarted","Data":"6f216ac671f7c21db7f3cdf84661e810eae62c7e4d3d98ff0ef3c27955f3208a"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.803768 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.807252 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d940c452-f401-4c40-accd-cb3178bc0490","Type":"ContainerStarted","Data":"cfb777df56a8a7c5b99556e60a375cf7ef57707229568ce73495832b458767a3"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.807278 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d940c452-f401-4c40-accd-cb3178bc0490","Type":"ContainerStarted","Data":"618a6fdd12b73afd5214a120da54b9f11515ef3e4da9fd5f154fcdc8715b32b5"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.809260 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36294ba3-fcdd-45cd-b4ff-20ee280751da","Type":"ContainerStarted","Data":"4ce125804c1146334c7ecfb1ec53114a5b004f6125b34f0141167c6e3dc499b6"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.809301 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36294ba3-fcdd-45cd-b4ff-20ee280751da","Type":"ContainerStarted","Data":"8bd0d94b0dcb72a0c45c8d080998a74a848a0d84e80c7f099ef2e81819c37caf"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.809311 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36294ba3-fcdd-45cd-b4ff-20ee280751da","Type":"ContainerStarted","Data":"0de1279c14e6faaac345c37b480ef12bead176ff1e3f16d3852d6ef4545ce2e3"} Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.829791 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.829769508 podStartE2EDuration="2.829769508s" podCreationTimestamp="2026-01-30 06:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:13.823741709 +0000 UTC m=+5549.193651966" watchObservedRunningTime="2026-01-30 06:40:13.829769508 +0000 UTC m=+5549.199679765" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.844118 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.844089018 podStartE2EDuration="2.844089018s" podCreationTimestamp="2026-01-30 06:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-30 06:40:13.83879341 +0000 UTC m=+5549.208703657" watchObservedRunningTime="2026-01-30 06:40:13.844089018 +0000 UTC m=+5549.213999275" Jan 30 06:40:13 crc kubenswrapper[4931]: I0130 06:40:13.857641 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.857624807 podStartE2EDuration="2.857624807s" podCreationTimestamp="2026-01-30 06:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:13.856265609 +0000 UTC m=+5549.226175886" watchObservedRunningTime="2026-01-30 06:40:13.857624807 +0000 UTC m=+5549.227535064" Jan 30 06:40:14 crc kubenswrapper[4931]: I0130 06:40:14.322834 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:40:14 crc kubenswrapper[4931]: I0130 06:40:14.818603 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"79469af6-a764-49c6-beaf-b49185c1028a","Type":"ContainerStarted","Data":"e907de379cac71ffd9eab7b864bb9ea7f16af34f0a9aa65c9d273acd302937e7"} Jan 30 06:40:14 crc kubenswrapper[4931]: I0130 06:40:14.840269 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.840249516 podStartE2EDuration="2.840249516s" podCreationTimestamp="2026-01-30 06:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:14.831126891 +0000 UTC m=+5550.201037158" watchObservedRunningTime="2026-01-30 06:40:14.840249516 +0000 UTC m=+5550.210159773" Jan 30 06:40:15 crc kubenswrapper[4931]: I0130 06:40:15.017452 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 06:40:15 crc kubenswrapper[4931]: I0130 
06:40:15.829472 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:17 crc kubenswrapper[4931]: I0130 06:40:17.337920 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:40:17 crc kubenswrapper[4931]: I0130 06:40:17.339188 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:40:18 crc kubenswrapper[4931]: I0130 06:40:18.310675 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 06:40:19 crc kubenswrapper[4931]: I0130 06:40:19.322281 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:40:19 crc kubenswrapper[4931]: I0130 06:40:19.336455 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:40:19 crc kubenswrapper[4931]: I0130 06:40:19.884136 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:40:20 crc kubenswrapper[4931]: I0130 06:40:20.016983 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 06:40:20 crc kubenswrapper[4931]: I0130 06:40:20.049554 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 06:40:20 crc kubenswrapper[4931]: I0130 06:40:20.930736 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 06:40:22 crc kubenswrapper[4931]: I0130 06:40:22.211082 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:40:22 crc kubenswrapper[4931]: I0130 06:40:22.211136 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Jan 30 06:40:22 crc kubenswrapper[4931]: I0130 06:40:22.286818 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 30 06:40:22 crc kubenswrapper[4931]: I0130 06:40:22.338234 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:40:22 crc kubenswrapper[4931]: I0130 06:40:22.338293 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:40:23 crc kubenswrapper[4931]: I0130 06:40:23.293593 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d940c452-f401-4c40-accd-cb3178bc0490" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:40:23 crc kubenswrapper[4931]: I0130 06:40:23.293661 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d940c452-f401-4c40-accd-cb3178bc0490" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:40:23 crc kubenswrapper[4931]: I0130 06:40:23.420608 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="36294ba3-fcdd-45cd-b4ff-20ee280751da" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.88:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:40:23 crc kubenswrapper[4931]: I0130 06:40:23.421011 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="36294ba3-fcdd-45cd-b4ff-20ee280751da" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.88:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.604952 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.607789 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.610523 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.628695 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.783204 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.783271 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.783379 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-scripts\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.783464 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2007a5f0-e092-4e2d-b41b-a32d073affcb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.783559 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.783600 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4smft\" (UniqueName: \"kubernetes.io/projected/2007a5f0-e092-4e2d-b41b-a32d073affcb-kube-api-access-4smft\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.884625 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.884664 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.884736 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.885473 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2007a5f0-e092-4e2d-b41b-a32d073affcb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.885551 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.885579 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4smft\" (UniqueName: \"kubernetes.io/projected/2007a5f0-e092-4e2d-b41b-a32d073affcb-kube-api-access-4smft\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.885729 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2007a5f0-e092-4e2d-b41b-a32d073affcb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.889912 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 
06:40:26.890553 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.891621 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.906989 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-scripts\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.922948 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4smft\" (UniqueName: \"kubernetes.io/projected/2007a5f0-e092-4e2d-b41b-a32d073affcb-kube-api-access-4smft\") pod \"cinder-scheduler-0\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:26 crc kubenswrapper[4931]: I0130 06:40:26.943800 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:40:27 crc kubenswrapper[4931]: W0130 06:40:27.475562 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2007a5f0_e092_4e2d_b41b_a32d073affcb.slice/crio-d15afda74e3554cef98d12203f17762f433838a8050f7e4bd23eddab3a91800b WatchSource:0}: Error finding container d15afda74e3554cef98d12203f17762f433838a8050f7e4bd23eddab3a91800b: Status 404 returned error can't find the container with id d15afda74e3554cef98d12203f17762f433838a8050f7e4bd23eddab3a91800b Jan 30 06:40:27 crc kubenswrapper[4931]: I0130 06:40:27.479127 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:27 crc kubenswrapper[4931]: I0130 06:40:27.956610 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2007a5f0-e092-4e2d-b41b-a32d073affcb","Type":"ContainerStarted","Data":"d15afda74e3554cef98d12203f17762f433838a8050f7e4bd23eddab3a91800b"} Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.163662 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.163899 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api-log" containerID="cri-o://63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a" gracePeriod=30 Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.164299 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api" containerID="cri-o://fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71" gracePeriod=30 Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.865622 4931 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.867330 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.869289 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.890388 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.967621 4931 generic.go:334] "Generic (PLEG): container finished" podID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerID="63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a" exitCode=143 Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.967689 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91d301dc-5f68-4e1b-ae27-51aa02e45789","Type":"ContainerDied","Data":"63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a"} Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.970444 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2007a5f0-e092-4e2d-b41b-a32d073affcb","Type":"ContainerStarted","Data":"804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7"} Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.970489 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2007a5f0-e092-4e2d-b41b-a32d073affcb","Type":"ContainerStarted","Data":"a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341"} Jan 30 06:40:28 crc kubenswrapper[4931]: I0130 06:40:28.995945 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.995919395 podStartE2EDuration="2.995919395s" 
podCreationTimestamp="2026-01-30 06:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:28.99143658 +0000 UTC m=+5564.361346857" watchObservedRunningTime="2026-01-30 06:40:28.995919395 +0000 UTC m=+5564.365829652" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022295 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-run\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022354 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022411 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022459 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022504 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022530 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022567 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022591 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022613 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022654 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022684 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022710 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022756 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbzl5\" (UniqueName: \"kubernetes.io/projected/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-kube-api-access-sbzl5\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.022884 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.023012 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.023067 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124367 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124481 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124501 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124530 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-run\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124551 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124545 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124598 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124628 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124657 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " 
pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124677 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124761 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124812 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124828 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.124953 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125102 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125176 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125537 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125569 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125610 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125634 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-config-data-custom\") pod 
\"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125688 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125733 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125753 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125800 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.125866 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbzl5\" (UniqueName: \"kubernetes.io/projected/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-kube-api-access-sbzl5\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc 
kubenswrapper[4931]: I0130 06:40:29.126087 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-run\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.130239 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.130596 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.130943 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.132094 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.144594 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.155932 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbzl5\" (UniqueName: \"kubernetes.io/projected/6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc-kube-api-access-sbzl5\") pod \"cinder-volume-volume1-0\" (UID: \"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc\") " pod="openstack/cinder-volume-volume1-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.182865 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.757280 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Jan 30 06:40:29 crc kubenswrapper[4931]: W0130 06:40:29.772043 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac5ad1d_f9ee_4fe6_8625_d30e49c099fc.slice/crio-5642d24f487a3fbd2fcb0926ffcb8b2a69a6c03c5d478f10c7fb2b597fadb0fe WatchSource:0}: Error finding container 5642d24f487a3fbd2fcb0926ffcb8b2a69a6c03c5d478f10c7fb2b597fadb0fe: Status 404 returned error can't find the container with id 5642d24f487a3fbd2fcb0926ffcb8b2a69a6c03c5d478f10c7fb2b597fadb0fe
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.787383 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.790094 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.794847 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.809280 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941069 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-scripts\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941133 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941159 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54m9w\" (UniqueName: \"kubernetes.io/projected/9be12b3c-c79f-4719-ab10-e3370519fbe3-kube-api-access-54m9w\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941208 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-config-data\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941265 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941283 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941324 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9be12b3c-c79f-4719-ab10-e3370519fbe3-ceph\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941346 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941367 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-sys\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941388 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941463 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-lib-modules\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941506 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-run\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941534 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941553 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-dev\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941621 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.941643 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:29 crc kubenswrapper[4931]: I0130 06:40:29.979771 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc","Type":"ContainerStarted","Data":"5642d24f487a3fbd2fcb0926ffcb8b2a69a6c03c5d478f10c7fb2b597fadb0fe"}
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.042911 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-scripts\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043141 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043247 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54m9w\" (UniqueName: \"kubernetes.io/projected/9be12b3c-c79f-4719-ab10-e3370519fbe3-kube-api-access-54m9w\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043350 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-config-data\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043260 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043443 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043551 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043724 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9be12b3c-c79f-4719-ab10-e3370519fbe3-ceph\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043741 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043767 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043868 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-sys\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043887 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-sys\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.043899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044048 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-lib-modules\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044136 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-run\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044161 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-lib-modules\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044218 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044190 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044246 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-run\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044291 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-dev\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-dev\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044387 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044450 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.044645 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9be12b3c-c79f-4719-ab10-e3370519fbe3-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.048779 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9be12b3c-c79f-4719-ab10-e3370519fbe3-ceph\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.049514 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-config-data\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.051996 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.052253 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.059980 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be12b3c-c79f-4719-ab10-e3370519fbe3-scripts\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.072493 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54m9w\" (UniqueName: \"kubernetes.io/projected/9be12b3c-c79f-4719-ab10-e3370519fbe3-kube-api-access-54m9w\") pod \"cinder-backup-0\" (UID: \"9be12b3c-c79f-4719-ab10-e3370519fbe3\") " pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.141708 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.736402 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Jan 30 06:40:30 crc kubenswrapper[4931]: I0130 06:40:30.994701 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9be12b3c-c79f-4719-ab10-e3370519fbe3","Type":"ContainerStarted","Data":"9e6db50e631e8f7cd1c4419c8dc147e9cd931bd6dde3cc6f4e72b4f794f82ae7"}
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.419941 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.83:8776/healthcheck\": dial tcp 10.217.1.83:8776: connect: connection refused"
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.786132 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881030 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data-custom\") pod \"91d301dc-5f68-4e1b-ae27-51aa02e45789\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") "
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881107 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-combined-ca-bundle\") pod \"91d301dc-5f68-4e1b-ae27-51aa02e45789\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") "
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881166 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d301dc-5f68-4e1b-ae27-51aa02e45789-logs\") pod \"91d301dc-5f68-4e1b-ae27-51aa02e45789\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") "
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881272 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91d301dc-5f68-4e1b-ae27-51aa02e45789-etc-machine-id\") pod \"91d301dc-5f68-4e1b-ae27-51aa02e45789\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") "
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881301 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv5dr\" (UniqueName: \"kubernetes.io/projected/91d301dc-5f68-4e1b-ae27-51aa02e45789-kube-api-access-lv5dr\") pod \"91d301dc-5f68-4e1b-ae27-51aa02e45789\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") "
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881342 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data\") pod \"91d301dc-5f68-4e1b-ae27-51aa02e45789\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") "
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881369 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-scripts\") pod \"91d301dc-5f68-4e1b-ae27-51aa02e45789\" (UID: \"91d301dc-5f68-4e1b-ae27-51aa02e45789\") "
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881712 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d301dc-5f68-4e1b-ae27-51aa02e45789-logs" (OuterVolumeSpecName: "logs") pod "91d301dc-5f68-4e1b-ae27-51aa02e45789" (UID: "91d301dc-5f68-4e1b-ae27-51aa02e45789"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.881814 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d301dc-5f68-4e1b-ae27-51aa02e45789-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.882504 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91d301dc-5f68-4e1b-ae27-51aa02e45789-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "91d301dc-5f68-4e1b-ae27-51aa02e45789" (UID: "91d301dc-5f68-4e1b-ae27-51aa02e45789"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.886000 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91d301dc-5f68-4e1b-ae27-51aa02e45789" (UID: "91d301dc-5f68-4e1b-ae27-51aa02e45789"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.887776 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d301dc-5f68-4e1b-ae27-51aa02e45789-kube-api-access-lv5dr" (OuterVolumeSpecName: "kube-api-access-lv5dr") pod "91d301dc-5f68-4e1b-ae27-51aa02e45789" (UID: "91d301dc-5f68-4e1b-ae27-51aa02e45789"). InnerVolumeSpecName "kube-api-access-lv5dr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.890071 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-scripts" (OuterVolumeSpecName: "scripts") pod "91d301dc-5f68-4e1b-ae27-51aa02e45789" (UID: "91d301dc-5f68-4e1b-ae27-51aa02e45789"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.938670 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91d301dc-5f68-4e1b-ae27-51aa02e45789" (UID: "91d301dc-5f68-4e1b-ae27-51aa02e45789"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.945228 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.967469 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data" (OuterVolumeSpecName: "config-data") pod "91d301dc-5f68-4e1b-ae27-51aa02e45789" (UID: "91d301dc-5f68-4e1b-ae27-51aa02e45789"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.983358 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91d301dc-5f68-4e1b-ae27-51aa02e45789-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.983498 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv5dr\" (UniqueName: \"kubernetes.io/projected/91d301dc-5f68-4e1b-ae27-51aa02e45789-kube-api-access-lv5dr\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.983515 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.983525 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.983538 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:31 crc kubenswrapper[4931]: I0130 06:40:31.983550 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d301dc-5f68-4e1b-ae27-51aa02e45789-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.007092 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc","Type":"ContainerStarted","Data":"b063f0ec733a75c515c3444f9995fcc2e2f8271849b0bef3378fd8d8b5836815"}
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.007143 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc","Type":"ContainerStarted","Data":"41ced069b41fadb731c4969dff4b70199edbd0b789a0831d47ae3305decc53b2"}
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.011252 4931 generic.go:334] "Generic (PLEG): container finished" podID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerID="fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71" exitCode=0
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.011316 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91d301dc-5f68-4e1b-ae27-51aa02e45789","Type":"ContainerDied","Data":"fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71"}
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.011381 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91d301dc-5f68-4e1b-ae27-51aa02e45789","Type":"ContainerDied","Data":"845bdc0d8784551cc86053ca285aa0afc7bce0f017005659d5e194550515ea02"}
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.011404 4931 scope.go:117] "RemoveContainer" containerID="fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.011622 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.038257 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.511902227 podStartE2EDuration="4.038231268s" podCreationTimestamp="2026-01-30 06:40:28 +0000 UTC" firstStartedPulling="2026-01-30 06:40:29.774726663 +0000 UTC m=+5565.144636920" lastFinishedPulling="2026-01-30 06:40:31.301055694 +0000 UTC m=+5566.670965961" observedRunningTime="2026-01-30 06:40:32.025100491 +0000 UTC m=+5567.395010748" watchObservedRunningTime="2026-01-30 06:40:32.038231268 +0000 UTC m=+5567.408141525"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.059323 4931 scope.go:117] "RemoveContainer" containerID="63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.066246 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.078486 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.092464 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:40:32 crc kubenswrapper[4931]: E0130 06:40:32.093076 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api-log"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.093096 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api-log"
Jan 30 06:40:32 crc kubenswrapper[4931]: E0130 06:40:32.093114 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.093120 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.093310 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api-log"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.093331 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" containerName="cinder-api"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.094314 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.096276 4931 scope.go:117] "RemoveContainer" containerID="fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71"
Jan 30 06:40:32 crc kubenswrapper[4931]: E0130 06:40:32.098570 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71\": container with ID starting with fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71 not found: ID does not exist" containerID="fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.098608 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71"} err="failed to get container status \"fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71\": rpc error: code = NotFound desc = could not find container \"fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71\": container with ID starting with fa29d3337c1f83563ce2742082bf23d118321e1590a92b71d5c9c2ccd94e7f71 not found: ID does not exist"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.098635 4931 scope.go:117] "RemoveContainer" containerID="63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.098862 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.101165 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:40:32 crc kubenswrapper[4931]: E0130 06:40:32.102615 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a\": container with ID starting with 63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a not found: ID does not exist" containerID="63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.102656 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a"} err="failed to get container status \"63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a\": rpc error: code = NotFound desc = could not find container \"63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a\": container with ID starting with 63f1e96dead88c4b078fa71100e8b05fe69a9964932dd921f7cca354e6177c0a not found: ID does not exist"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.186543 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.186637 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-scripts\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.186678 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbzz8\" (UniqueName: \"kubernetes.io/projected/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-kube-api-access-xbzz8\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.186761 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-logs\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.186813 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.186899 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-config-data\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.186929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-config-data-custom\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.217493 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.217570 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.218476 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.218509 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.226683 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.226807 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.288845 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-config-data-custom\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.288936 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0"
Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.288983 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-scripts\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.289008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbzz8\" (UniqueName: \"kubernetes.io/projected/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-kube-api-access-xbzz8\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.289031 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.289089 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-logs\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.289151 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.289268 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-config-data\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 
06:40:32.290485 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-logs\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.301485 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-config-data\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.301854 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.302186 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-config-data-custom\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.302964 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-scripts\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.316005 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbzz8\" (UniqueName: \"kubernetes.io/projected/abe3ac27-91a6-4c8d-880c-b94ad5bd7aea-kube-api-access-xbzz8\") pod \"cinder-api-0\" (UID: \"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea\") " 
pod="openstack/cinder-api-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.344091 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.350631 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.354082 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 06:40:32 crc kubenswrapper[4931]: I0130 06:40:32.420238 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 06:40:33 crc kubenswrapper[4931]: I0130 06:40:33.022646 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9be12b3c-c79f-4719-ab10-e3370519fbe3","Type":"ContainerStarted","Data":"241980f49c40f95d2ccdfb5357b574f3c4bdb551b651c252c35ffe07760989a5"} Jan 30 06:40:33 crc kubenswrapper[4931]: I0130 06:40:33.023173 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:40:33 crc kubenswrapper[4931]: I0130 06:40:33.029476 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 06:40:33 crc kubenswrapper[4931]: I0130 06:40:33.434341 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d301dc-5f68-4e1b-ae27-51aa02e45789" path="/var/lib/kubelet/pods/91d301dc-5f68-4e1b-ae27-51aa02e45789/volumes" Jan 30 06:40:34 crc kubenswrapper[4931]: I0130 06:40:34.053899 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea","Type":"ContainerStarted","Data":"b795b23d62ea617100f52742f280a1726eb7614ebc8642bffe33eb4dd426f5a2"} Jan 30 06:40:34 crc kubenswrapper[4931]: I0130 06:40:34.053953 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea","Type":"ContainerStarted","Data":"c94b8d210b840686b1f5b5650a9d9ed09fbcc5403a55b6dbea820787dbbe4a03"} Jan 30 06:40:34 crc kubenswrapper[4931]: I0130 06:40:34.056946 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9be12b3c-c79f-4719-ab10-e3370519fbe3","Type":"ContainerStarted","Data":"dd4ba9b31d20fedfba2cc15424b62059543696bef04d4dc149131425e2fecb4f"} Jan 30 06:40:34 crc kubenswrapper[4931]: I0130 06:40:34.101916 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.307649185 podStartE2EDuration="5.10188878s" podCreationTimestamp="2026-01-30 06:40:29 +0000 UTC" firstStartedPulling="2026-01-30 06:40:30.720890614 +0000 UTC m=+5566.090800881" lastFinishedPulling="2026-01-30 06:40:32.515130219 +0000 UTC m=+5567.885040476" observedRunningTime="2026-01-30 06:40:34.095333447 +0000 UTC m=+5569.465243744" watchObservedRunningTime="2026-01-30 06:40:34.10188878 +0000 UTC m=+5569.471799067" Jan 30 06:40:34 crc kubenswrapper[4931]: I0130 06:40:34.184592 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:35 crc kubenswrapper[4931]: I0130 06:40:35.073486 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"abe3ac27-91a6-4c8d-880c-b94ad5bd7aea","Type":"ContainerStarted","Data":"342d06d8ce0fb2789d4cd5be8ca534ba878a2b478b5e7f10823efea4311de174"} Jan 30 06:40:35 crc kubenswrapper[4931]: I0130 06:40:35.075753 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 06:40:35 crc kubenswrapper[4931]: I0130 06:40:35.114236 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.114207321 podStartE2EDuration="3.114207321s" podCreationTimestamp="2026-01-30 
06:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:35.099709356 +0000 UTC m=+5570.469619703" watchObservedRunningTime="2026-01-30 06:40:35.114207321 +0000 UTC m=+5570.484117618" Jan 30 06:40:35 crc kubenswrapper[4931]: I0130 06:40:35.142508 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 30 06:40:37 crc kubenswrapper[4931]: I0130 06:40:37.232248 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 06:40:37 crc kubenswrapper[4931]: I0130 06:40:37.347060 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:38 crc kubenswrapper[4931]: I0130 06:40:38.105309 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="cinder-scheduler" containerID="cri-o://a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341" gracePeriod=30 Jan 30 06:40:38 crc kubenswrapper[4931]: I0130 06:40:38.105376 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="probe" containerID="cri-o://804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7" gracePeriod=30 Jan 30 06:40:39 crc kubenswrapper[4931]: I0130 06:40:39.117562 4931 generic.go:334] "Generic (PLEG): container finished" podID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerID="804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7" exitCode=0 Jan 30 06:40:39 crc kubenswrapper[4931]: I0130 06:40:39.117636 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"2007a5f0-e092-4e2d-b41b-a32d073affcb","Type":"ContainerDied","Data":"804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7"} Jan 30 06:40:39 crc kubenswrapper[4931]: I0130 06:40:39.385312 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.339606 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.659156 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.765551 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4smft\" (UniqueName: \"kubernetes.io/projected/2007a5f0-e092-4e2d-b41b-a32d073affcb-kube-api-access-4smft\") pod \"2007a5f0-e092-4e2d-b41b-a32d073affcb\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.765596 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data\") pod \"2007a5f0-e092-4e2d-b41b-a32d073affcb\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.765671 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data-custom\") pod \"2007a5f0-e092-4e2d-b41b-a32d073affcb\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.765699 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2007a5f0-e092-4e2d-b41b-a32d073affcb-etc-machine-id\") pod \"2007a5f0-e092-4e2d-b41b-a32d073affcb\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.765777 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-scripts\") pod \"2007a5f0-e092-4e2d-b41b-a32d073affcb\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.765819 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-combined-ca-bundle\") pod \"2007a5f0-e092-4e2d-b41b-a32d073affcb\" (UID: \"2007a5f0-e092-4e2d-b41b-a32d073affcb\") " Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.765997 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2007a5f0-e092-4e2d-b41b-a32d073affcb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2007a5f0-e092-4e2d-b41b-a32d073affcb" (UID: "2007a5f0-e092-4e2d-b41b-a32d073affcb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.766512 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2007a5f0-e092-4e2d-b41b-a32d073affcb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.771750 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-scripts" (OuterVolumeSpecName: "scripts") pod "2007a5f0-e092-4e2d-b41b-a32d073affcb" (UID: "2007a5f0-e092-4e2d-b41b-a32d073affcb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.772446 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2007a5f0-e092-4e2d-b41b-a32d073affcb" (UID: "2007a5f0-e092-4e2d-b41b-a32d073affcb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.780218 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2007a5f0-e092-4e2d-b41b-a32d073affcb-kube-api-access-4smft" (OuterVolumeSpecName: "kube-api-access-4smft") pod "2007a5f0-e092-4e2d-b41b-a32d073affcb" (UID: "2007a5f0-e092-4e2d-b41b-a32d073affcb"). InnerVolumeSpecName "kube-api-access-4smft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.819449 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2007a5f0-e092-4e2d-b41b-a32d073affcb" (UID: "2007a5f0-e092-4e2d-b41b-a32d073affcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.856494 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data" (OuterVolumeSpecName: "config-data") pod "2007a5f0-e092-4e2d-b41b-a32d073affcb" (UID: "2007a5f0-e092-4e2d-b41b-a32d073affcb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.867803 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.868039 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4smft\" (UniqueName: \"kubernetes.io/projected/2007a5f0-e092-4e2d-b41b-a32d073affcb-kube-api-access-4smft\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.868052 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.868063 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:40 crc kubenswrapper[4931]: I0130 06:40:40.868073 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2007a5f0-e092-4e2d-b41b-a32d073affcb-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.147677 4931 generic.go:334] "Generic (PLEG): container finished" podID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerID="a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341" exitCode=0 Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.147742 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2007a5f0-e092-4e2d-b41b-a32d073affcb","Type":"ContainerDied","Data":"a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341"} Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 
06:40:41.147787 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2007a5f0-e092-4e2d-b41b-a32d073affcb","Type":"ContainerDied","Data":"d15afda74e3554cef98d12203f17762f433838a8050f7e4bd23eddab3a91800b"} Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.147810 4931 scope.go:117] "RemoveContainer" containerID="804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.147826 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.177801 4931 scope.go:117] "RemoveContainer" containerID="a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.201549 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.218751 4931 scope.go:117] "RemoveContainer" containerID="804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7" Jan 30 06:40:41 crc kubenswrapper[4931]: E0130 06:40:41.219174 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7\": container with ID starting with 804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7 not found: ID does not exist" containerID="804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.219214 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7"} err="failed to get container status \"804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7\": rpc error: code = NotFound desc = could not find container 
\"804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7\": container with ID starting with 804dabd22ea25238e25d9c6a0e8795f0865ccf0fe93f2320a7a8bd027766dca7 not found: ID does not exist" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.219241 4931 scope.go:117] "RemoveContainer" containerID="a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341" Jan 30 06:40:41 crc kubenswrapper[4931]: E0130 06:40:41.219782 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341\": container with ID starting with a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341 not found: ID does not exist" containerID="a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.219853 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341"} err="failed to get container status \"a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341\": rpc error: code = NotFound desc = could not find container \"a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341\": container with ID starting with a2c7a2fd3f1dbd78cbc3957d77cdeb8cf798121b652c6b3276edb45a82a90341 not found: ID does not exist" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.221247 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.270505 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:41 crc kubenswrapper[4931]: E0130 06:40:41.270971 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="probe" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.270988 
4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="probe" Jan 30 06:40:41 crc kubenswrapper[4931]: E0130 06:40:41.271003 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="cinder-scheduler" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.271009 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="cinder-scheduler" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.271197 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="cinder-scheduler" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.271213 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" containerName="probe" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.272234 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.276687 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.289177 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.377783 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.377921 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.378109 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.378177 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.378372 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41673f24-5c01-4401-839f-55da60930b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.378452 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trs27\" (UniqueName: \"kubernetes.io/projected/41673f24-5c01-4401-839f-55da60930b4d-kube-api-access-trs27\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.437997 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2007a5f0-e092-4e2d-b41b-a32d073affcb" path="/var/lib/kubelet/pods/2007a5f0-e092-4e2d-b41b-a32d073affcb/volumes" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.480230 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41673f24-5c01-4401-839f-55da60930b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.480300 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trs27\" (UniqueName: \"kubernetes.io/projected/41673f24-5c01-4401-839f-55da60930b4d-kube-api-access-trs27\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.480360 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-config-data\") pod \"cinder-scheduler-0\" 
(UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.480455 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41673f24-5c01-4401-839f-55da60930b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.480496 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.480773 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.480854 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.486617 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.486765 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.488460 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.497043 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trs27\" (UniqueName: \"kubernetes.io/projected/41673f24-5c01-4401-839f-55da60930b4d-kube-api-access-trs27\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.503308 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41673f24-5c01-4401-839f-55da60930b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"41673f24-5c01-4401-839f-55da60930b4d\") " pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.592169 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.778095 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jlzk5"] Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.780801 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.790227 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlzk5"] Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.887953 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-utilities\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.888024 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-catalog-content\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.888093 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgkbx\" (UniqueName: \"kubernetes.io/projected/eb599a5d-5068-4fab-bf45-937582c34eca-kube-api-access-sgkbx\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.992001 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-utilities\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.992151 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-catalog-content\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.992274 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgkbx\" (UniqueName: \"kubernetes.io/projected/eb599a5d-5068-4fab-bf45-937582c34eca-kube-api-access-sgkbx\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.993218 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-utilities\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:41 crc kubenswrapper[4931]: I0130 06:40:41.993239 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-catalog-content\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:42 crc kubenswrapper[4931]: I0130 06:40:42.012054 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgkbx\" (UniqueName: \"kubernetes.io/projected/eb599a5d-5068-4fab-bf45-937582c34eca-kube-api-access-sgkbx\") pod \"redhat-operators-jlzk5\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:42 crc kubenswrapper[4931]: I0130 06:40:42.119798 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:40:42 crc kubenswrapper[4931]: I0130 
06:40:42.124460 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:42 crc kubenswrapper[4931]: I0130 06:40:42.157502 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"41673f24-5c01-4401-839f-55da60930b4d","Type":"ContainerStarted","Data":"0bca28356d538272b347fc705dd6df1b5b2c095a93c54f361d0cddb95a6926cf"} Jan 30 06:40:42 crc kubenswrapper[4931]: I0130 06:40:42.586562 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlzk5"] Jan 30 06:40:43 crc kubenswrapper[4931]: I0130 06:40:43.170201 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"41673f24-5c01-4401-839f-55da60930b4d","Type":"ContainerStarted","Data":"5f39efd8390221c52290c8ba3f91c44523e0379119244139ebeb46bf34ec68d0"} Jan 30 06:40:43 crc kubenswrapper[4931]: I0130 06:40:43.172481 4931 generic.go:334] "Generic (PLEG): container finished" podID="eb599a5d-5068-4fab-bf45-937582c34eca" containerID="53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e" exitCode=0 Jan 30 06:40:43 crc kubenswrapper[4931]: I0130 06:40:43.172511 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlzk5" event={"ID":"eb599a5d-5068-4fab-bf45-937582c34eca","Type":"ContainerDied","Data":"53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e"} Jan 30 06:40:43 crc kubenswrapper[4931]: I0130 06:40:43.172527 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlzk5" event={"ID":"eb599a5d-5068-4fab-bf45-937582c34eca","Type":"ContainerStarted","Data":"f05b848343471061d40f87134776e9f8a1736a2776fb3239d3e410c22aaedae8"} Jan 30 06:40:44 crc kubenswrapper[4931]: I0130 06:40:44.186213 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"41673f24-5c01-4401-839f-55da60930b4d","Type":"ContainerStarted","Data":"f88c3a6367ef2f74a464be569950933828eb7608482628c57c45a4dafb65e494"} Jan 30 06:40:44 crc kubenswrapper[4931]: I0130 06:40:44.191808 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlzk5" event={"ID":"eb599a5d-5068-4fab-bf45-937582c34eca","Type":"ContainerStarted","Data":"ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade"} Jan 30 06:40:44 crc kubenswrapper[4931]: I0130 06:40:44.224267 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.224246092 podStartE2EDuration="3.224246092s" podCreationTimestamp="2026-01-30 06:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:44.212514494 +0000 UTC m=+5579.582424751" watchObservedRunningTime="2026-01-30 06:40:44.224246092 +0000 UTC m=+5579.594156359" Jan 30 06:40:44 crc kubenswrapper[4931]: I0130 06:40:44.494870 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 06:40:46 crc kubenswrapper[4931]: I0130 06:40:46.212496 4931 generic.go:334] "Generic (PLEG): container finished" podID="eb599a5d-5068-4fab-bf45-937582c34eca" containerID="ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade" exitCode=0 Jan 30 06:40:46 crc kubenswrapper[4931]: I0130 06:40:46.212671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlzk5" event={"ID":"eb599a5d-5068-4fab-bf45-937582c34eca","Type":"ContainerDied","Data":"ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade"} Jan 30 06:40:46 crc kubenswrapper[4931]: I0130 06:40:46.592700 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 06:40:47 crc kubenswrapper[4931]: I0130 
06:40:47.228294 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlzk5" event={"ID":"eb599a5d-5068-4fab-bf45-937582c34eca","Type":"ContainerStarted","Data":"0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f"} Jan 30 06:40:47 crc kubenswrapper[4931]: I0130 06:40:47.263039 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jlzk5" podStartSLOduration=2.567082178 podStartE2EDuration="6.263004655s" podCreationTimestamp="2026-01-30 06:40:41 +0000 UTC" firstStartedPulling="2026-01-30 06:40:43.180209574 +0000 UTC m=+5578.550119831" lastFinishedPulling="2026-01-30 06:40:46.876132051 +0000 UTC m=+5582.246042308" observedRunningTime="2026-01-30 06:40:47.255382581 +0000 UTC m=+5582.625292838" watchObservedRunningTime="2026-01-30 06:40:47.263004655 +0000 UTC m=+5582.632914952" Jan 30 06:40:51 crc kubenswrapper[4931]: I0130 06:40:51.782938 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 06:40:52 crc kubenswrapper[4931]: I0130 06:40:52.125406 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:52 crc kubenswrapper[4931]: I0130 06:40:52.125754 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:40:53 crc kubenswrapper[4931]: I0130 06:40:53.177358 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jlzk5" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="registry-server" probeResult="failure" output=< Jan 30 06:40:53 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 06:40:53 crc kubenswrapper[4931]: > Jan 30 06:40:57 crc kubenswrapper[4931]: I0130 06:40:57.363452 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:40:57 crc kubenswrapper[4931]: I0130 06:40:57.364169 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:41:02 crc kubenswrapper[4931]: I0130 06:41:02.188164 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:41:02 crc kubenswrapper[4931]: I0130 06:41:02.280628 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:41:02 crc kubenswrapper[4931]: I0130 06:41:02.433043 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlzk5"] Jan 30 06:41:03 crc kubenswrapper[4931]: I0130 06:41:03.426561 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jlzk5" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="registry-server" containerID="cri-o://0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f" gracePeriod=2 Jan 30 06:41:03 crc kubenswrapper[4931]: I0130 06:41:03.969483 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.093787 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgkbx\" (UniqueName: \"kubernetes.io/projected/eb599a5d-5068-4fab-bf45-937582c34eca-kube-api-access-sgkbx\") pod \"eb599a5d-5068-4fab-bf45-937582c34eca\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.093903 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-catalog-content\") pod \"eb599a5d-5068-4fab-bf45-937582c34eca\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.093927 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-utilities\") pod \"eb599a5d-5068-4fab-bf45-937582c34eca\" (UID: \"eb599a5d-5068-4fab-bf45-937582c34eca\") " Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.095781 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-utilities" (OuterVolumeSpecName: "utilities") pod "eb599a5d-5068-4fab-bf45-937582c34eca" (UID: "eb599a5d-5068-4fab-bf45-937582c34eca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.103090 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb599a5d-5068-4fab-bf45-937582c34eca-kube-api-access-sgkbx" (OuterVolumeSpecName: "kube-api-access-sgkbx") pod "eb599a5d-5068-4fab-bf45-937582c34eca" (UID: "eb599a5d-5068-4fab-bf45-937582c34eca"). InnerVolumeSpecName "kube-api-access-sgkbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.197354 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgkbx\" (UniqueName: \"kubernetes.io/projected/eb599a5d-5068-4fab-bf45-937582c34eca-kube-api-access-sgkbx\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.197413 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.213787 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb599a5d-5068-4fab-bf45-937582c34eca" (UID: "eb599a5d-5068-4fab-bf45-937582c34eca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.299480 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb599a5d-5068-4fab-bf45-937582c34eca-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.442538 4931 generic.go:334] "Generic (PLEG): container finished" podID="eb599a5d-5068-4fab-bf45-937582c34eca" containerID="0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f" exitCode=0 Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.442600 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlzk5" event={"ID":"eb599a5d-5068-4fab-bf45-937582c34eca","Type":"ContainerDied","Data":"0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f"} Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.442640 4931 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jlzk5" event={"ID":"eb599a5d-5068-4fab-bf45-937582c34eca","Type":"ContainerDied","Data":"f05b848343471061d40f87134776e9f8a1736a2776fb3239d3e410c22aaedae8"} Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.442668 4931 scope.go:117] "RemoveContainer" containerID="0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.442872 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlzk5" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.509322 4931 scope.go:117] "RemoveContainer" containerID="ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.509454 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlzk5"] Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.533138 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jlzk5"] Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.550553 4931 scope.go:117] "RemoveContainer" containerID="53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.593042 4931 scope.go:117] "RemoveContainer" containerID="0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f" Jan 30 06:41:04 crc kubenswrapper[4931]: E0130 06:41:04.593460 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f\": container with ID starting with 0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f not found: ID does not exist" containerID="0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.593493 4931 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f"} err="failed to get container status \"0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f\": rpc error: code = NotFound desc = could not find container \"0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f\": container with ID starting with 0ee05a3ad3b0c1019b7e1d336a82186f59b7cf43c43a12a8b93665177821ed8f not found: ID does not exist" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.593514 4931 scope.go:117] "RemoveContainer" containerID="ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade" Jan 30 06:41:04 crc kubenswrapper[4931]: E0130 06:41:04.593801 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade\": container with ID starting with ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade not found: ID does not exist" containerID="ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.593858 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade"} err="failed to get container status \"ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade\": rpc error: code = NotFound desc = could not find container \"ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade\": container with ID starting with ba3091fc93684e0350a82996dc3bab508983b4d574fa3f6038591e98d98c1ade not found: ID does not exist" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.593876 4931 scope.go:117] "RemoveContainer" containerID="53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e" Jan 30 06:41:04 crc kubenswrapper[4931]: E0130 
06:41:04.594196 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e\": container with ID starting with 53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e not found: ID does not exist" containerID="53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e" Jan 30 06:41:04 crc kubenswrapper[4931]: I0130 06:41:04.594222 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e"} err="failed to get container status \"53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e\": rpc error: code = NotFound desc = could not find container \"53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e\": container with ID starting with 53f457b98adb8f7e5c0cd7641b26244cf7089d77f7f8044d2f7dcc33e6146a2e not found: ID does not exist" Jan 30 06:41:05 crc kubenswrapper[4931]: I0130 06:41:05.438620 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" path="/var/lib/kubelet/pods/eb599a5d-5068-4fab-bf45-937582c34eca/volumes" Jan 30 06:41:27 crc kubenswrapper[4931]: I0130 06:41:27.363264 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:41:27 crc kubenswrapper[4931]: I0130 06:41:27.363970 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 30 06:41:57 crc kubenswrapper[4931]: I0130 06:41:57.363048 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:41:57 crc kubenswrapper[4931]: I0130 06:41:57.363695 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:41:57 crc kubenswrapper[4931]: I0130 06:41:57.363995 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:41:57 crc kubenswrapper[4931]: I0130 06:41:57.365121 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:41:57 crc kubenswrapper[4931]: I0130 06:41:57.365193 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" gracePeriod=600 Jan 30 06:41:57 crc kubenswrapper[4931]: E0130 06:41:57.493624 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:41:58 crc kubenswrapper[4931]: I0130 06:41:58.062015 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" exitCode=0 Jan 30 06:41:58 crc kubenswrapper[4931]: I0130 06:41:58.062077 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"} Jan 30 06:41:58 crc kubenswrapper[4931]: I0130 06:41:58.062123 4931 scope.go:117] "RemoveContainer" containerID="7acc7dd93c5206c6ff8ebf3271a041083dbd9bcf0e00cc88a42d6c0b4c7429dd" Jan 30 06:41:58 crc kubenswrapper[4931]: I0130 06:41:58.062957 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:41:58 crc kubenswrapper[4931]: E0130 06:41:58.063529 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:42:12 crc kubenswrapper[4931]: I0130 06:42:12.422520 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:42:12 crc kubenswrapper[4931]: E0130 06:42:12.423465 4931 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.093745 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bn5b4"] Jan 30 06:42:13 crc kubenswrapper[4931]: E0130 06:42:13.094490 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="extract-content" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.094521 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="extract-content" Jan 30 06:42:13 crc kubenswrapper[4931]: E0130 06:42:13.094570 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="registry-server" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.094583 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="registry-server" Jan 30 06:42:13 crc kubenswrapper[4931]: E0130 06:42:13.094610 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="extract-utilities" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.094625 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="extract-utilities" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.094935 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb599a5d-5068-4fab-bf45-937582c34eca" containerName="registry-server" Jan 30 06:42:13 crc 
kubenswrapper[4931]: I0130 06:42:13.097320 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.111359 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bn5b4"] Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.201770 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f9cc\" (UniqueName: \"kubernetes.io/projected/b9cea615-7c24-42ec-b0b3-ba654afb5e48-kube-api-access-6f9cc\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.201828 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-utilities\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.201889 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-catalog-content\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.303943 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f9cc\" (UniqueName: \"kubernetes.io/projected/b9cea615-7c24-42ec-b0b3-ba654afb5e48-kube-api-access-6f9cc\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " 
pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.304027 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-utilities\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.304099 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-catalog-content\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.305036 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-catalog-content\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.305194 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-utilities\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.330608 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f9cc\" (UniqueName: \"kubernetes.io/projected/b9cea615-7c24-42ec-b0b3-ba654afb5e48-kube-api-access-6f9cc\") pod \"certified-operators-bn5b4\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " 
pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.438534 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:13 crc kubenswrapper[4931]: I0130 06:42:13.990109 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bn5b4"] Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.242500 4931 generic.go:334] "Generic (PLEG): container finished" podID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerID="8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf" exitCode=0 Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.242540 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bn5b4" event={"ID":"b9cea615-7c24-42ec-b0b3-ba654afb5e48","Type":"ContainerDied","Data":"8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf"} Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.242568 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bn5b4" event={"ID":"b9cea615-7c24-42ec-b0b3-ba654afb5e48","Type":"ContainerStarted","Data":"07b667d5a687f0233563586b73c3bec225ac3979ee53165d693ac755c7d56b48"} Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.512337 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f58fk"] Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.518780 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.533709 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f58fk"] Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.633433 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-catalog-content\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.633659 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpqjz\" (UniqueName: \"kubernetes.io/projected/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-kube-api-access-qpqjz\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.633803 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-utilities\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.735730 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-utilities\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.735858 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-catalog-content\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.735900 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpqjz\" (UniqueName: \"kubernetes.io/projected/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-kube-api-access-qpqjz\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.736465 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-catalog-content\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.737724 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-utilities\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.755818 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpqjz\" (UniqueName: \"kubernetes.io/projected/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-kube-api-access-qpqjz\") pod \"redhat-marketplace-f58fk\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:14 crc kubenswrapper[4931]: I0130 06:42:14.853744 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:15 crc kubenswrapper[4931]: I0130 06:42:15.324076 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f58fk"] Jan 30 06:42:15 crc kubenswrapper[4931]: W0130 06:42:15.331593 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a3b4ace_6e1f_47cb_86a9_e5488138abc1.slice/crio-95b1f6f5d405e856dde3770b24820eb431009b98160bff8df51161ca28427373 WatchSource:0}: Error finding container 95b1f6f5d405e856dde3770b24820eb431009b98160bff8df51161ca28427373: Status 404 returned error can't find the container with id 95b1f6f5d405e856dde3770b24820eb431009b98160bff8df51161ca28427373 Jan 30 06:42:16 crc kubenswrapper[4931]: I0130 06:42:16.268017 4931 generic.go:334] "Generic (PLEG): container finished" podID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerID="6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35" exitCode=0 Jan 30 06:42:16 crc kubenswrapper[4931]: I0130 06:42:16.268097 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bn5b4" event={"ID":"b9cea615-7c24-42ec-b0b3-ba654afb5e48","Type":"ContainerDied","Data":"6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35"} Jan 30 06:42:16 crc kubenswrapper[4931]: I0130 06:42:16.271083 4931 generic.go:334] "Generic (PLEG): container finished" podID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerID="82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560" exitCode=0 Jan 30 06:42:16 crc kubenswrapper[4931]: I0130 06:42:16.271125 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f58fk" event={"ID":"3a3b4ace-6e1f-47cb-86a9-e5488138abc1","Type":"ContainerDied","Data":"82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560"} Jan 30 06:42:16 crc kubenswrapper[4931]: I0130 
06:42:16.271151 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f58fk" event={"ID":"3a3b4ace-6e1f-47cb-86a9-e5488138abc1","Type":"ContainerStarted","Data":"95b1f6f5d405e856dde3770b24820eb431009b98160bff8df51161ca28427373"} Jan 30 06:42:17 crc kubenswrapper[4931]: I0130 06:42:17.283504 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bn5b4" event={"ID":"b9cea615-7c24-42ec-b0b3-ba654afb5e48","Type":"ContainerStarted","Data":"68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf"} Jan 30 06:42:17 crc kubenswrapper[4931]: I0130 06:42:17.286627 4931 generic.go:334] "Generic (PLEG): container finished" podID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerID="c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3" exitCode=0 Jan 30 06:42:17 crc kubenswrapper[4931]: I0130 06:42:17.286697 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f58fk" event={"ID":"3a3b4ace-6e1f-47cb-86a9-e5488138abc1","Type":"ContainerDied","Data":"c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3"} Jan 30 06:42:17 crc kubenswrapper[4931]: I0130 06:42:17.327458 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bn5b4" podStartSLOduration=1.8615570830000001 podStartE2EDuration="4.327412077s" podCreationTimestamp="2026-01-30 06:42:13 +0000 UTC" firstStartedPulling="2026-01-30 06:42:14.244013436 +0000 UTC m=+5669.613923683" lastFinishedPulling="2026-01-30 06:42:16.70986838 +0000 UTC m=+5672.079778677" observedRunningTime="2026-01-30 06:42:17.304004442 +0000 UTC m=+5672.673914699" watchObservedRunningTime="2026-01-30 06:42:17.327412077 +0000 UTC m=+5672.697322354" Jan 30 06:42:18 crc kubenswrapper[4931]: I0130 06:42:18.298622 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f58fk" 
event={"ID":"3a3b4ace-6e1f-47cb-86a9-e5488138abc1","Type":"ContainerStarted","Data":"33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097"} Jan 30 06:42:18 crc kubenswrapper[4931]: I0130 06:42:18.331405 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f58fk" podStartSLOduration=2.914215787 podStartE2EDuration="4.331379904s" podCreationTimestamp="2026-01-30 06:42:14 +0000 UTC" firstStartedPulling="2026-01-30 06:42:16.275030105 +0000 UTC m=+5671.644940402" lastFinishedPulling="2026-01-30 06:42:17.692194252 +0000 UTC m=+5673.062104519" observedRunningTime="2026-01-30 06:42:18.323238757 +0000 UTC m=+5673.693149034" watchObservedRunningTime="2026-01-30 06:42:18.331379904 +0000 UTC m=+5673.701290191" Jan 30 06:42:23 crc kubenswrapper[4931]: I0130 06:42:23.445583 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:23 crc kubenswrapper[4931]: I0130 06:42:23.446263 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:23 crc kubenswrapper[4931]: I0130 06:42:23.534667 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:24 crc kubenswrapper[4931]: I0130 06:42:24.447628 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:24 crc kubenswrapper[4931]: I0130 06:42:24.525972 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bn5b4"] Jan 30 06:42:24 crc kubenswrapper[4931]: I0130 06:42:24.854139 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:24 crc kubenswrapper[4931]: I0130 06:42:24.854249 4931 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:24 crc kubenswrapper[4931]: I0130 06:42:24.921808 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:25 crc kubenswrapper[4931]: I0130 06:42:25.475851 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:26 crc kubenswrapper[4931]: I0130 06:42:26.403348 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bn5b4" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="registry-server" containerID="cri-o://68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf" gracePeriod=2 Jan 30 06:42:26 crc kubenswrapper[4931]: I0130 06:42:26.423134 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:42:26 crc kubenswrapper[4931]: E0130 06:42:26.423592 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:42:26 crc kubenswrapper[4931]: I0130 06:42:26.878324 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f58fk"] Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.002349 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.111047 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-catalog-content\") pod \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.111410 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-utilities\") pod \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.111789 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f9cc\" (UniqueName: \"kubernetes.io/projected/b9cea615-7c24-42ec-b0b3-ba654afb5e48-kube-api-access-6f9cc\") pod \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\" (UID: \"b9cea615-7c24-42ec-b0b3-ba654afb5e48\") " Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.112667 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-utilities" (OuterVolumeSpecName: "utilities") pod "b9cea615-7c24-42ec-b0b3-ba654afb5e48" (UID: "b9cea615-7c24-42ec-b0b3-ba654afb5e48"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.112996 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.121989 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9cea615-7c24-42ec-b0b3-ba654afb5e48-kube-api-access-6f9cc" (OuterVolumeSpecName: "kube-api-access-6f9cc") pod "b9cea615-7c24-42ec-b0b3-ba654afb5e48" (UID: "b9cea615-7c24-42ec-b0b3-ba654afb5e48"). InnerVolumeSpecName "kube-api-access-6f9cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.183144 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9cea615-7c24-42ec-b0b3-ba654afb5e48" (UID: "b9cea615-7c24-42ec-b0b3-ba654afb5e48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.216349 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f9cc\" (UniqueName: \"kubernetes.io/projected/b9cea615-7c24-42ec-b0b3-ba654afb5e48-kube-api-access-6f9cc\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.216467 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cea615-7c24-42ec-b0b3-ba654afb5e48-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.438355 4931 generic.go:334] "Generic (PLEG): container finished" podID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerID="68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf" exitCode=0 Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.438612 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bn5b4" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.439461 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f58fk" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="registry-server" containerID="cri-o://33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097" gracePeriod=2 Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.454736 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bn5b4" event={"ID":"b9cea615-7c24-42ec-b0b3-ba654afb5e48","Type":"ContainerDied","Data":"68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf"} Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.454820 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bn5b4" 
event={"ID":"b9cea615-7c24-42ec-b0b3-ba654afb5e48","Type":"ContainerDied","Data":"07b667d5a687f0233563586b73c3bec225ac3979ee53165d693ac755c7d56b48"} Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.454862 4931 scope.go:117] "RemoveContainer" containerID="68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.500474 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bn5b4"] Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.503190 4931 scope.go:117] "RemoveContainer" containerID="6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.511147 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bn5b4"] Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.574891 4931 scope.go:117] "RemoveContainer" containerID="8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.671481 4931 scope.go:117] "RemoveContainer" containerID="68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf" Jan 30 06:42:27 crc kubenswrapper[4931]: E0130 06:42:27.672378 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf\": container with ID starting with 68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf not found: ID does not exist" containerID="68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.672478 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf"} err="failed to get container status 
\"68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf\": rpc error: code = NotFound desc = could not find container \"68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf\": container with ID starting with 68259ee6ca07d14b60b659f82f59dfb82a716e0e1c7f6e1d45171b2eadc5aecf not found: ID does not exist" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.672521 4931 scope.go:117] "RemoveContainer" containerID="6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35" Jan 30 06:42:27 crc kubenswrapper[4931]: E0130 06:42:27.673123 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35\": container with ID starting with 6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35 not found: ID does not exist" containerID="6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.673163 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35"} err="failed to get container status \"6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35\": rpc error: code = NotFound desc = could not find container \"6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35\": container with ID starting with 6bd86b79bf75d3dfdcb96d3398f82d4396a81b534462fd1fef006bc55f442a35 not found: ID does not exist" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.673190 4931 scope.go:117] "RemoveContainer" containerID="8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf" Jan 30 06:42:27 crc kubenswrapper[4931]: E0130 06:42:27.673916 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf\": container with ID starting with 8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf not found: ID does not exist" containerID="8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.673959 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf"} err="failed to get container status \"8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf\": rpc error: code = NotFound desc = could not find container \"8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf\": container with ID starting with 8d15430013f5c26b2602305fbef33139ae2278fae40d50766e8c390fcc24c7cf not found: ID does not exist" Jan 30 06:42:27 crc kubenswrapper[4931]: I0130 06:42:27.998187 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.033855 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-catalog-content\") pod \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.033957 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-utilities\") pod \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.034032 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpqjz\" (UniqueName: 
\"kubernetes.io/projected/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-kube-api-access-qpqjz\") pod \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\" (UID: \"3a3b4ace-6e1f-47cb-86a9-e5488138abc1\") " Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.041318 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-utilities" (OuterVolumeSpecName: "utilities") pod "3a3b4ace-6e1f-47cb-86a9-e5488138abc1" (UID: "3a3b4ace-6e1f-47cb-86a9-e5488138abc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.041530 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-kube-api-access-qpqjz" (OuterVolumeSpecName: "kube-api-access-qpqjz") pod "3a3b4ace-6e1f-47cb-86a9-e5488138abc1" (UID: "3a3b4ace-6e1f-47cb-86a9-e5488138abc1"). InnerVolumeSpecName "kube-api-access-qpqjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.077561 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a3b4ace-6e1f-47cb-86a9-e5488138abc1" (UID: "3a3b4ace-6e1f-47cb-86a9-e5488138abc1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.135925 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.136219 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.136302 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpqjz\" (UniqueName: \"kubernetes.io/projected/3a3b4ace-6e1f-47cb-86a9-e5488138abc1-kube-api-access-qpqjz\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.447853 4931 generic.go:334] "Generic (PLEG): container finished" podID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerID="33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097" exitCode=0 Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.447907 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f58fk" event={"ID":"3a3b4ace-6e1f-47cb-86a9-e5488138abc1","Type":"ContainerDied","Data":"33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097"} Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.447938 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f58fk" event={"ID":"3a3b4ace-6e1f-47cb-86a9-e5488138abc1","Type":"ContainerDied","Data":"95b1f6f5d405e856dde3770b24820eb431009b98160bff8df51161ca28427373"} Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.447955 4931 scope.go:117] "RemoveContainer" containerID="33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 
06:42:28.448046 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f58fk" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.481209 4931 scope.go:117] "RemoveContainer" containerID="c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.489399 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f58fk"] Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.500038 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f58fk"] Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.504297 4931 scope.go:117] "RemoveContainer" containerID="82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.535998 4931 scope.go:117] "RemoveContainer" containerID="33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097" Jan 30 06:42:28 crc kubenswrapper[4931]: E0130 06:42:28.536361 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097\": container with ID starting with 33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097 not found: ID does not exist" containerID="33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.536392 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097"} err="failed to get container status \"33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097\": rpc error: code = NotFound desc = could not find container \"33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097\": container with ID starting with 
33b05cbdc1617d484d891da5422fefa3e3f3db732ebd7f3db528bd94dad2d097 not found: ID does not exist" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.536412 4931 scope.go:117] "RemoveContainer" containerID="c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3" Jan 30 06:42:28 crc kubenswrapper[4931]: E0130 06:42:28.536598 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3\": container with ID starting with c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3 not found: ID does not exist" containerID="c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.536623 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3"} err="failed to get container status \"c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3\": rpc error: code = NotFound desc = could not find container \"c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3\": container with ID starting with c8aa8b279e240894abe786d23c00883f82a3c5a04ca06c870773fa7f9e0837a3 not found: ID does not exist" Jan 30 06:42:28 crc kubenswrapper[4931]: I0130 06:42:28.536635 4931 scope.go:117] "RemoveContainer" containerID="82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560" Jan 30 06:42:28 crc kubenswrapper[4931]: E0130 06:42:28.537040 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560\": container with ID starting with 82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560 not found: ID does not exist" containerID="82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560" Jan 30 06:42:28 crc 
kubenswrapper[4931]: I0130 06:42:28.537062 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560"} err="failed to get container status \"82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560\": rpc error: code = NotFound desc = could not find container \"82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560\": container with ID starting with 82ffd628dbc1781de1a42d2d3dfd1ecf27324557af794f3be3a9579fbf4a1560 not found: ID does not exist" Jan 30 06:42:29 crc kubenswrapper[4931]: I0130 06:42:29.435953 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" path="/var/lib/kubelet/pods/3a3b4ace-6e1f-47cb-86a9-e5488138abc1/volumes" Jan 30 06:42:29 crc kubenswrapper[4931]: I0130 06:42:29.436914 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" path="/var/lib/kubelet/pods/b9cea615-7c24-42ec-b0b3-ba654afb5e48/volumes" Jan 30 06:42:31 crc kubenswrapper[4931]: I0130 06:42:31.068896 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4gjjc"] Jan 30 06:42:31 crc kubenswrapper[4931]: I0130 06:42:31.081148 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9c4e-account-create-update-vz7cn"] Jan 30 06:42:31 crc kubenswrapper[4931]: I0130 06:42:31.093100 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4gjjc"] Jan 30 06:42:31 crc kubenswrapper[4931]: I0130 06:42:31.103756 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9c4e-account-create-update-vz7cn"] Jan 30 06:42:31 crc kubenswrapper[4931]: I0130 06:42:31.443359 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b58826-6a83-4c91-a9f7-8c6c861c509b" path="/var/lib/kubelet/pods/d5b58826-6a83-4c91-a9f7-8c6c861c509b/volumes" 
Jan 30 06:42:31 crc kubenswrapper[4931]: I0130 06:42:31.444717 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95012d1-b402-48eb-baf4-36fabfd1e4f2" path="/var/lib/kubelet/pods/d95012d1-b402-48eb-baf4-36fabfd1e4f2/volumes" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.025253 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cgvfd"] Jan 30 06:42:33 crc kubenswrapper[4931]: E0130 06:42:33.025845 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="extract-content" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.025869 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="extract-content" Jan 30 06:42:33 crc kubenswrapper[4931]: E0130 06:42:33.025904 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="extract-content" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.025916 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="extract-content" Jan 30 06:42:33 crc kubenswrapper[4931]: E0130 06:42:33.025936 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="extract-utilities" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.025947 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="extract-utilities" Jan 30 06:42:33 crc kubenswrapper[4931]: E0130 06:42:33.025972 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="extract-utilities" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.025982 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="extract-utilities" 
Jan 30 06:42:33 crc kubenswrapper[4931]: E0130 06:42:33.026008 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="registry-server" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.026022 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="registry-server" Jan 30 06:42:33 crc kubenswrapper[4931]: E0130 06:42:33.026039 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="registry-server" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.026050 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="registry-server" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.026320 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3b4ace-6e1f-47cb-86a9-e5488138abc1" containerName="registry-server" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.026338 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9cea615-7c24-42ec-b0b3-ba654afb5e48" containerName="registry-server" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.027348 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.029122 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-run-ovn\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.029166 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-log-ovn\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.029214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-scripts\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.029404 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-run\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.029577 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66sx7\" (UniqueName: \"kubernetes.io/projected/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-kube-api-access-66sx7\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 
30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.037000 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.037030 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jzqpv" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.050564 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cgvfd"] Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.063862 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5tgfr"] Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.065681 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.081198 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5tgfr"] Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.131035 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66sx7\" (UniqueName: \"kubernetes.io/projected/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-kube-api-access-66sx7\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.131127 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-run-ovn\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.131157 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-log-ovn\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.131195 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-scripts\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.131242 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-run\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.131596 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-run\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.132622 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-run-ovn\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.132674 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-var-log-ovn\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 
crc kubenswrapper[4931]: I0130 06:42:33.134386 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-scripts\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.177328 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66sx7\" (UniqueName: \"kubernetes.io/projected/9bf15e4b-1a09-401b-87e9-97cff0ee8c91-kube-api-access-66sx7\") pod \"ovn-controller-cgvfd\" (UID: \"9bf15e4b-1a09-401b-87e9-97cff0ee8c91\") " pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.232459 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-log\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.232792 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-etc-ovs\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.232894 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-lib\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.232984 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sjmd\" (UniqueName: \"kubernetes.io/projected/56a7c911-151f-42ff-b005-58bdaecd5d8b-kube-api-access-2sjmd\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.233067 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-run\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.233190 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a7c911-151f-42ff-b005-58bdaecd5d8b-scripts\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.334837 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-log\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.335704 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-etc-ovs\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.335887 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-lib\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.336040 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sjmd\" (UniqueName: \"kubernetes.io/projected/56a7c911-151f-42ff-b005-58bdaecd5d8b-kube-api-access-2sjmd\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.336393 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-run\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.336599 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a7c911-151f-42ff-b005-58bdaecd5d8b-scripts\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.335851 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-etc-ovs\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.335311 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-log\") pod \"ovn-controller-ovs-5tgfr\" (UID: 
\"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.336545 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-run\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.335997 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/56a7c911-151f-42ff-b005-58bdaecd5d8b-var-lib\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.338449 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a7c911-151f-42ff-b005-58bdaecd5d8b-scripts\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.355494 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.356064 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sjmd\" (UniqueName: \"kubernetes.io/projected/56a7c911-151f-42ff-b005-58bdaecd5d8b-kube-api-access-2sjmd\") pod \"ovn-controller-ovs-5tgfr\" (UID: \"56a7c911-151f-42ff-b005-58bdaecd5d8b\") " pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.380765 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:33 crc kubenswrapper[4931]: I0130 06:42:33.841513 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cgvfd"] Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.295118 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5tgfr"] Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.427541 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cs66w"] Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.430202 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.434719 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.466263 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cs66w"] Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.475079 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-ovs-rundir\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.475184 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-config\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.476391 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjwh\" (UniqueName: \"kubernetes.io/projected/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-kube-api-access-gtjwh\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.476439 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-ovn-rundir\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.528289 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cgvfd" event={"ID":"9bf15e4b-1a09-401b-87e9-97cff0ee8c91","Type":"ContainerStarted","Data":"0c3c3eeca6c73b3ff16722355fd66f5c651bc1882cab39286a44a1e8ad76c897"} Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.528329 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cgvfd" event={"ID":"9bf15e4b-1a09-401b-87e9-97cff0ee8c91","Type":"ContainerStarted","Data":"5d07d9338e1c0281f7605804ef343063b26310b357099627addd260ceaf76b53"} Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.530410 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-cgvfd" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.532211 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tgfr" event={"ID":"56a7c911-151f-42ff-b005-58bdaecd5d8b","Type":"ContainerStarted","Data":"396cc406a2c1dc92073d551789c027ec668df6ffb85ebe2a4b91603199d49adc"} Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.556550 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-cgvfd" 
podStartSLOduration=2.556529366 podStartE2EDuration="2.556529366s" podCreationTimestamp="2026-01-30 06:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:34.548011268 +0000 UTC m=+5689.917921535" watchObservedRunningTime="2026-01-30 06:42:34.556529366 +0000 UTC m=+5689.926439623" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.577639 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-config\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.578666 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjwh\" (UniqueName: \"kubernetes.io/projected/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-kube-api-access-gtjwh\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.578743 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-ovn-rundir\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.578896 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-ovs-rundir\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.579184 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-ovs-rundir\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.580143 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-config\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.580397 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-ovn-rundir\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.603112 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjwh\" (UniqueName: \"kubernetes.io/projected/608bb576-83fd-4c7c-b8b3-a4f9ff46b661-kube-api-access-gtjwh\") pod \"ovn-controller-metrics-cs66w\" (UID: \"608bb576-83fd-4c7c-b8b3-a4f9ff46b661\") " pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.720231 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-5v2g5"] Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.724354 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.737532 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-5v2g5"] Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.770466 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cs66w" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.781321 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-operator-scripts\") pod \"octavia-db-create-5v2g5\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.781377 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgbxs\" (UniqueName: \"kubernetes.io/projected/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-kube-api-access-rgbxs\") pod \"octavia-db-create-5v2g5\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.888518 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-operator-scripts\") pod \"octavia-db-create-5v2g5\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.888866 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgbxs\" (UniqueName: \"kubernetes.io/projected/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-kube-api-access-rgbxs\") pod \"octavia-db-create-5v2g5\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " pod="openstack/octavia-db-create-5v2g5" 
Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.890137 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-operator-scripts\") pod \"octavia-db-create-5v2g5\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:34 crc kubenswrapper[4931]: I0130 06:42:34.914637 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgbxs\" (UniqueName: \"kubernetes.io/projected/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-kube-api-access-rgbxs\") pod \"octavia-db-create-5v2g5\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.043823 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.108504 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cs66w"] Jan 30 06:42:35 crc kubenswrapper[4931]: W0130 06:42:35.116075 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod608bb576_83fd_4c7c_b8b3_a4f9ff46b661.slice/crio-870d92e75412e44964d9ea5d91d90956408a5d712425501bbc09693af95cd863 WatchSource:0}: Error finding container 870d92e75412e44964d9ea5d91d90956408a5d712425501bbc09693af95cd863: Status 404 returned error can't find the container with id 870d92e75412e44964d9ea5d91d90956408a5d712425501bbc09693af95cd863 Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.531544 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-5v2g5"] Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.552125 4931 generic.go:334] "Generic (PLEG): container finished" podID="56a7c911-151f-42ff-b005-58bdaecd5d8b" 
containerID="ed613fb39544acb199b57f50f713767fc77e2920f1d258851689a1db05c78b4b" exitCode=0 Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.552483 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tgfr" event={"ID":"56a7c911-151f-42ff-b005-58bdaecd5d8b","Type":"ContainerDied","Data":"ed613fb39544acb199b57f50f713767fc77e2920f1d258851689a1db05c78b4b"} Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.556733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cs66w" event={"ID":"608bb576-83fd-4c7c-b8b3-a4f9ff46b661","Type":"ContainerStarted","Data":"db61070dbe381cafba8aa9a2ec0fa041b77e8020fdbc00408666a5001a641f61"} Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.556760 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cs66w" event={"ID":"608bb576-83fd-4c7c-b8b3-a4f9ff46b661","Type":"ContainerStarted","Data":"870d92e75412e44964d9ea5d91d90956408a5d712425501bbc09693af95cd863"} Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.564101 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5v2g5" event={"ID":"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031","Type":"ContainerStarted","Data":"4f31ce8325d581327e30dec1b76d7448dbaa5c149d47de8e58a7992f4f878229"} Jan 30 06:42:35 crc kubenswrapper[4931]: I0130 06:42:35.600033 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cs66w" podStartSLOduration=1.600015049 podStartE2EDuration="1.600015049s" podCreationTimestamp="2026-01-30 06:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:35.585384379 +0000 UTC m=+5690.955294636" watchObservedRunningTime="2026-01-30 06:42:35.600015049 +0000 UTC m=+5690.969925306" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.361242 4931 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/octavia-db99-account-create-update-8dpb5"] Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.363143 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.366631 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.375933 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db99-account-create-update-8dpb5"] Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.431544 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-685pf\" (UniqueName: \"kubernetes.io/projected/21268b76-c5b2-457f-a433-ff2da3b9bd10-kube-api-access-685pf\") pod \"octavia-db99-account-create-update-8dpb5\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.431920 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21268b76-c5b2-457f-a433-ff2da3b9bd10-operator-scripts\") pod \"octavia-db99-account-create-update-8dpb5\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.534005 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-685pf\" (UniqueName: \"kubernetes.io/projected/21268b76-c5b2-457f-a433-ff2da3b9bd10-kube-api-access-685pf\") pod \"octavia-db99-account-create-update-8dpb5\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.534124 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21268b76-c5b2-457f-a433-ff2da3b9bd10-operator-scripts\") pod \"octavia-db99-account-create-update-8dpb5\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.535202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21268b76-c5b2-457f-a433-ff2da3b9bd10-operator-scripts\") pod \"octavia-db99-account-create-update-8dpb5\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.555372 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-685pf\" (UniqueName: \"kubernetes.io/projected/21268b76-c5b2-457f-a433-ff2da3b9bd10-kube-api-access-685pf\") pod \"octavia-db99-account-create-update-8dpb5\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.572874 4931 generic.go:334] "Generic (PLEG): container finished" podID="cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031" containerID="fc2609bf05101f454b500a411b2af7bec596ed7b2a503264f64b371462ed10d1" exitCode=0 Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.572964 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5v2g5" event={"ID":"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031","Type":"ContainerDied","Data":"fc2609bf05101f454b500a411b2af7bec596ed7b2a503264f64b371462ed10d1"} Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.575375 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tgfr" 
event={"ID":"56a7c911-151f-42ff-b005-58bdaecd5d8b","Type":"ContainerStarted","Data":"89c255ef545094cd8326f3a470ab7d7a2d1bab357be8a008fe08a91ef20fb55e"} Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.575449 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5tgfr" event={"ID":"56a7c911-151f-42ff-b005-58bdaecd5d8b","Type":"ContainerStarted","Data":"929ba428a303a75db914f2ca8c52c2f16380b2e1370ef0fef0e2e21040d610dd"} Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.575650 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.627590 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5tgfr" podStartSLOduration=4.627566656 podStartE2EDuration="4.627566656s" podCreationTimestamp="2026-01-30 06:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:36.618693588 +0000 UTC m=+5691.988603865" watchObservedRunningTime="2026-01-30 06:42:36.627566656 +0000 UTC m=+5691.997476913" Jan 30 06:42:36 crc kubenswrapper[4931]: I0130 06:42:36.681324 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.049601 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ckz5s"] Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.058764 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ckz5s"] Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.178377 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db99-account-create-update-8dpb5"] Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.435968 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65588685-2245-486d-b7a9-95b8a71f8ff7" path="/var/lib/kubelet/pods/65588685-2245-486d-b7a9-95b8a71f8ff7/volumes" Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.590760 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db99-account-create-update-8dpb5" event={"ID":"21268b76-c5b2-457f-a433-ff2da3b9bd10","Type":"ContainerStarted","Data":"e4d54f2baf7c89b6f0656a4f82e037b9be01132659b5c11c20729732cbeb53c0"} Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.590837 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db99-account-create-update-8dpb5" event={"ID":"21268b76-c5b2-457f-a433-ff2da3b9bd10","Type":"ContainerStarted","Data":"5bd83a89b959b37a4f5ed3c77a498449419d8ecd7c9dcba1f993459e25b6e2c3"} Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.591272 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.612399 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db99-account-create-update-8dpb5" podStartSLOduration=1.612354586 podStartE2EDuration="1.612354586s" podCreationTimestamp="2026-01-30 06:42:36 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:37.60854118 +0000 UTC m=+5692.978451487" watchObservedRunningTime="2026-01-30 06:42:37.612354586 +0000 UTC m=+5692.982264883" Jan 30 06:42:37 crc kubenswrapper[4931]: I0130 06:42:37.988210 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.069611 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgbxs\" (UniqueName: \"kubernetes.io/projected/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-kube-api-access-rgbxs\") pod \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.069720 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-operator-scripts\") pod \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\" (UID: \"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031\") " Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.070228 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031" (UID: "cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.070604 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.075793 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-kube-api-access-rgbxs" (OuterVolumeSpecName: "kube-api-access-rgbxs") pod "cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031" (UID: "cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031"). InnerVolumeSpecName "kube-api-access-rgbxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.172631 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgbxs\" (UniqueName: \"kubernetes.io/projected/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031-kube-api-access-rgbxs\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.606183 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5v2g5" event={"ID":"cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031","Type":"ContainerDied","Data":"4f31ce8325d581327e30dec1b76d7448dbaa5c149d47de8e58a7992f4f878229"} Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.606244 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f31ce8325d581327e30dec1b76d7448dbaa5c149d47de8e58a7992f4f878229" Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.606338 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-5v2g5" Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.609808 4931 generic.go:334] "Generic (PLEG): container finished" podID="21268b76-c5b2-457f-a433-ff2da3b9bd10" containerID="e4d54f2baf7c89b6f0656a4f82e037b9be01132659b5c11c20729732cbeb53c0" exitCode=0 Jan 30 06:42:38 crc kubenswrapper[4931]: I0130 06:42:38.609880 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db99-account-create-update-8dpb5" event={"ID":"21268b76-c5b2-457f-a433-ff2da3b9bd10","Type":"ContainerDied","Data":"e4d54f2baf7c89b6f0656a4f82e037b9be01132659b5c11c20729732cbeb53c0"} Jan 30 06:42:39 crc kubenswrapper[4931]: I0130 06:42:39.422330 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:42:39 crc kubenswrapper[4931]: E0130 06:42:39.423100 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.129898 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.214465 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-685pf\" (UniqueName: \"kubernetes.io/projected/21268b76-c5b2-457f-a433-ff2da3b9bd10-kube-api-access-685pf\") pod \"21268b76-c5b2-457f-a433-ff2da3b9bd10\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.214773 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21268b76-c5b2-457f-a433-ff2da3b9bd10-operator-scripts\") pod \"21268b76-c5b2-457f-a433-ff2da3b9bd10\" (UID: \"21268b76-c5b2-457f-a433-ff2da3b9bd10\") " Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.215590 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21268b76-c5b2-457f-a433-ff2da3b9bd10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21268b76-c5b2-457f-a433-ff2da3b9bd10" (UID: "21268b76-c5b2-457f-a433-ff2da3b9bd10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.223948 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21268b76-c5b2-457f-a433-ff2da3b9bd10-kube-api-access-685pf" (OuterVolumeSpecName: "kube-api-access-685pf") pod "21268b76-c5b2-457f-a433-ff2da3b9bd10" (UID: "21268b76-c5b2-457f-a433-ff2da3b9bd10"). InnerVolumeSpecName "kube-api-access-685pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.326369 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21268b76-c5b2-457f-a433-ff2da3b9bd10-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.326890 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-685pf\" (UniqueName: \"kubernetes.io/projected/21268b76-c5b2-457f-a433-ff2da3b9bd10-kube-api-access-685pf\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.641115 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db99-account-create-update-8dpb5" event={"ID":"21268b76-c5b2-457f-a433-ff2da3b9bd10","Type":"ContainerDied","Data":"5bd83a89b959b37a4f5ed3c77a498449419d8ecd7c9dcba1f993459e25b6e2c3"} Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.641174 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bd83a89b959b37a4f5ed3c77a498449419d8ecd7c9dcba1f993459e25b6e2c3" Jan 30 06:42:40 crc kubenswrapper[4931]: I0130 06:42:40.641265 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db99-account-create-update-8dpb5" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.351415 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-scftb"] Jan 30 06:42:42 crc kubenswrapper[4931]: E0130 06:42:42.352169 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21268b76-c5b2-457f-a433-ff2da3b9bd10" containerName="mariadb-account-create-update" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.352187 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="21268b76-c5b2-457f-a433-ff2da3b9bd10" containerName="mariadb-account-create-update" Jan 30 06:42:42 crc kubenswrapper[4931]: E0130 06:42:42.352204 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031" containerName="mariadb-database-create" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.352212 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031" containerName="mariadb-database-create" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.352462 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031" containerName="mariadb-database-create" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.352498 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="21268b76-c5b2-457f-a433-ff2da3b9bd10" containerName="mariadb-account-create-update" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.353199 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.377872 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-scftb"] Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.466224 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd2kj\" (UniqueName: \"kubernetes.io/projected/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-kube-api-access-gd2kj\") pod \"octavia-persistence-db-create-scftb\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.466306 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-operator-scripts\") pod \"octavia-persistence-db-create-scftb\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.568543 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-operator-scripts\") pod \"octavia-persistence-db-create-scftb\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.569687 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-operator-scripts\") pod \"octavia-persistence-db-create-scftb\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.570982 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd2kj\" (UniqueName: \"kubernetes.io/projected/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-kube-api-access-gd2kj\") pod \"octavia-persistence-db-create-scftb\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.591023 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd2kj\" (UniqueName: \"kubernetes.io/projected/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-kube-api-access-gd2kj\") pod \"octavia-persistence-db-create-scftb\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:42 crc kubenswrapper[4931]: I0130 06:42:42.690892 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.188534 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-scftb"] Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.543273 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-06bb-account-create-update-9w2dc"] Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.544741 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.557037 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.559621 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-06bb-account-create-update-9w2dc"] Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.593297 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2k85\" (UniqueName: \"kubernetes.io/projected/cffbc623-924e-4952-890f-da78398d60fb-kube-api-access-g2k85\") pod \"octavia-06bb-account-create-update-9w2dc\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.593435 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cffbc623-924e-4952-890f-da78398d60fb-operator-scripts\") pod \"octavia-06bb-account-create-update-9w2dc\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.674607 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-scftb" event={"ID":"7c32b952-ea20-4e38-be3d-0ca833fb8aaf","Type":"ContainerStarted","Data":"e1e6f3089c65c6555f71402d2087406c6daee75232130f6c5b8762d180358f01"} Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.674657 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-scftb" event={"ID":"7c32b952-ea20-4e38-be3d-0ca833fb8aaf","Type":"ContainerStarted","Data":"7d096987e285866c77fbd01b68c7f6ac747b1c7e0268d9f81a6b6350b6a68d69"} Jan 30 06:42:43 crc kubenswrapper[4931]: 
I0130 06:42:43.695672 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2k85\" (UniqueName: \"kubernetes.io/projected/cffbc623-924e-4952-890f-da78398d60fb-kube-api-access-g2k85\") pod \"octavia-06bb-account-create-update-9w2dc\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.695739 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cffbc623-924e-4952-890f-da78398d60fb-operator-scripts\") pod \"octavia-06bb-account-create-update-9w2dc\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.697325 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-persistence-db-create-scftb" podStartSLOduration=1.697306749 podStartE2EDuration="1.697306749s" podCreationTimestamp="2026-01-30 06:42:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:43.696733963 +0000 UTC m=+5699.066644220" watchObservedRunningTime="2026-01-30 06:42:43.697306749 +0000 UTC m=+5699.067217006" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.697770 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cffbc623-924e-4952-890f-da78398d60fb-operator-scripts\") pod \"octavia-06bb-account-create-update-9w2dc\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.715861 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2k85\" (UniqueName: 
\"kubernetes.io/projected/cffbc623-924e-4952-890f-da78398d60fb-kube-api-access-g2k85\") pod \"octavia-06bb-account-create-update-9w2dc\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:43 crc kubenswrapper[4931]: I0130 06:42:43.866604 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:44 crc kubenswrapper[4931]: I0130 06:42:44.376698 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-06bb-account-create-update-9w2dc"] Jan 30 06:42:44 crc kubenswrapper[4931]: I0130 06:42:44.687780 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-06bb-account-create-update-9w2dc" event={"ID":"cffbc623-924e-4952-890f-da78398d60fb","Type":"ContainerStarted","Data":"530188cbe277ff66c92e88116244fb7e483ee43f7966980292a116725f942bcd"} Jan 30 06:42:44 crc kubenswrapper[4931]: I0130 06:42:44.688050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-06bb-account-create-update-9w2dc" event={"ID":"cffbc623-924e-4952-890f-da78398d60fb","Type":"ContainerStarted","Data":"c14c6caddb40b4927eb0e087bda428a9231bf8741a027a2476743c80e0b8dcda"} Jan 30 06:42:44 crc kubenswrapper[4931]: I0130 06:42:44.690263 4931 generic.go:334] "Generic (PLEG): container finished" podID="7c32b952-ea20-4e38-be3d-0ca833fb8aaf" containerID="e1e6f3089c65c6555f71402d2087406c6daee75232130f6c5b8762d180358f01" exitCode=0 Jan 30 06:42:44 crc kubenswrapper[4931]: I0130 06:42:44.690286 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-scftb" event={"ID":"7c32b952-ea20-4e38-be3d-0ca833fb8aaf","Type":"ContainerDied","Data":"e1e6f3089c65c6555f71402d2087406c6daee75232130f6c5b8762d180358f01"} Jan 30 06:42:44 crc kubenswrapper[4931]: I0130 06:42:44.710570 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/octavia-06bb-account-create-update-9w2dc" podStartSLOduration=1.7105528460000001 podStartE2EDuration="1.710552846s" podCreationTimestamp="2026-01-30 06:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:44.704932709 +0000 UTC m=+5700.074842966" watchObservedRunningTime="2026-01-30 06:42:44.710552846 +0000 UTC m=+5700.080463103" Jan 30 06:42:45 crc kubenswrapper[4931]: I0130 06:42:45.701837 4931 generic.go:334] "Generic (PLEG): container finished" podID="cffbc623-924e-4952-890f-da78398d60fb" containerID="530188cbe277ff66c92e88116244fb7e483ee43f7966980292a116725f942bcd" exitCode=0 Jan 30 06:42:45 crc kubenswrapper[4931]: I0130 06:42:45.702302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-06bb-account-create-update-9w2dc" event={"ID":"cffbc623-924e-4952-890f-da78398d60fb","Type":"ContainerDied","Data":"530188cbe277ff66c92e88116244fb7e483ee43f7966980292a116725f942bcd"} Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.243108 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.342942 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-operator-scripts\") pod \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.343025 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd2kj\" (UniqueName: \"kubernetes.io/projected/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-kube-api-access-gd2kj\") pod \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\" (UID: \"7c32b952-ea20-4e38-be3d-0ca833fb8aaf\") " Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.343456 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c32b952-ea20-4e38-be3d-0ca833fb8aaf" (UID: "7c32b952-ea20-4e38-be3d-0ca833fb8aaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.348233 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-kube-api-access-gd2kj" (OuterVolumeSpecName: "kube-api-access-gd2kj") pod "7c32b952-ea20-4e38-be3d-0ca833fb8aaf" (UID: "7c32b952-ea20-4e38-be3d-0ca833fb8aaf"). InnerVolumeSpecName "kube-api-access-gd2kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.445873 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.445925 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd2kj\" (UniqueName: \"kubernetes.io/projected/7c32b952-ea20-4e38-be3d-0ca833fb8aaf-kube-api-access-gd2kj\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.714386 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-scftb" Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.714391 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-scftb" event={"ID":"7c32b952-ea20-4e38-be3d-0ca833fb8aaf","Type":"ContainerDied","Data":"7d096987e285866c77fbd01b68c7f6ac747b1c7e0268d9f81a6b6350b6a68d69"} Jan 30 06:42:46 crc kubenswrapper[4931]: I0130 06:42:46.714477 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d096987e285866c77fbd01b68c7f6ac747b1c7e0268d9f81a6b6350b6a68d69" Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.046485 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.168598 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cffbc623-924e-4952-890f-da78398d60fb-operator-scripts\") pod \"cffbc623-924e-4952-890f-da78398d60fb\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.168888 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2k85\" (UniqueName: \"kubernetes.io/projected/cffbc623-924e-4952-890f-da78398d60fb-kube-api-access-g2k85\") pod \"cffbc623-924e-4952-890f-da78398d60fb\" (UID: \"cffbc623-924e-4952-890f-da78398d60fb\") " Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.169145 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cffbc623-924e-4952-890f-da78398d60fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cffbc623-924e-4952-890f-da78398d60fb" (UID: "cffbc623-924e-4952-890f-da78398d60fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.169562 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cffbc623-924e-4952-890f-da78398d60fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.175399 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffbc623-924e-4952-890f-da78398d60fb-kube-api-access-g2k85" (OuterVolumeSpecName: "kube-api-access-g2k85") pod "cffbc623-924e-4952-890f-da78398d60fb" (UID: "cffbc623-924e-4952-890f-da78398d60fb"). InnerVolumeSpecName "kube-api-access-g2k85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.271206 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2k85\" (UniqueName: \"kubernetes.io/projected/cffbc623-924e-4952-890f-da78398d60fb-kube-api-access-g2k85\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.746083 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-06bb-account-create-update-9w2dc" event={"ID":"cffbc623-924e-4952-890f-da78398d60fb","Type":"ContainerDied","Data":"c14c6caddb40b4927eb0e087bda428a9231bf8741a027a2476743c80e0b8dcda"} Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.746123 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14c6caddb40b4927eb0e087bda428a9231bf8741a027a2476743c80e0b8dcda" Jan 30 06:42:47 crc kubenswrapper[4931]: I0130 06:42:47.746175 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-06bb-account-create-update-9w2dc" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.237832 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-79684d7c94-4r69m"] Jan 30 06:42:49 crc kubenswrapper[4931]: E0130 06:42:49.238467 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffbc623-924e-4952-890f-da78398d60fb" containerName="mariadb-account-create-update" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.238478 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffbc623-924e-4952-890f-da78398d60fb" containerName="mariadb-account-create-update" Jan 30 06:42:49 crc kubenswrapper[4931]: E0130 06:42:49.238494 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c32b952-ea20-4e38-be3d-0ca833fb8aaf" containerName="mariadb-database-create" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.238500 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7c32b952-ea20-4e38-be3d-0ca833fb8aaf" containerName="mariadb-database-create" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.238695 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffbc623-924e-4952-890f-da78398d60fb" containerName="mariadb-account-create-update" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.238711 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c32b952-ea20-4e38-be3d-0ca833fb8aaf" containerName="mariadb-database-create" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.239968 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.241757 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-8cr2b" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.242676 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.243184 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.259885 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-79684d7c94-4r69m"] Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.414332 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-combined-ca-bundle\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.414519 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-octavia-run\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.414786 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-config-data-merged\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.414885 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-config-data\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.414994 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-scripts\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.516260 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-scripts\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.516347 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-combined-ca-bundle\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.516405 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-octavia-run\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.516540 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-config-data-merged\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.516595 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-config-data\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.517266 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-config-data-merged\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.517607 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-octavia-run\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.523841 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-combined-ca-bundle\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.534571 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-scripts\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.535652 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395-config-data\") pod \"octavia-api-79684d7c94-4r69m\" (UID: \"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395\") " pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:49 crc kubenswrapper[4931]: I0130 06:42:49.561349 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:42:50 crc kubenswrapper[4931]: I0130 06:42:50.079999 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-79684d7c94-4r69m"] Jan 30 06:42:50 crc kubenswrapper[4931]: I0130 06:42:50.422259 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:42:50 crc kubenswrapper[4931]: E0130 06:42:50.422583 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:42:50 crc kubenswrapper[4931]: I0130 06:42:50.775377 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79684d7c94-4r69m" event={"ID":"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395","Type":"ContainerStarted","Data":"f5467262858899783db67c1a783f82564094e34125fc6828b0ef571352e14f0a"} Jan 30 06:42:51 crc kubenswrapper[4931]: I0130 06:42:51.150808 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dj4rz"] Jan 30 06:42:51 crc kubenswrapper[4931]: I0130 06:42:51.166810 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dj4rz"] Jan 30 06:42:51 crc kubenswrapper[4931]: I0130 06:42:51.439036 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e6e702-ef29-49bd-836a-f46b2abd51cc" path="/var/lib/kubelet/pods/26e6e702-ef29-49bd-836a-f46b2abd51cc/volumes" Jan 30 06:42:54 crc kubenswrapper[4931]: I0130 06:42:54.979075 4931 scope.go:117] "RemoveContainer" containerID="4c48b12d2b0648e9dfb8706d20edad9baaeb7875a958059b6dc53ace27e47c1e" Jan 30 06:43:00 crc 
kubenswrapper[4931]: I0130 06:43:00.402207 4931 scope.go:117] "RemoveContainer" containerID="cdd69294edc054ababcd2665bc64a680aa08a0804663ff331148d5c4aedf9140" Jan 30 06:43:00 crc kubenswrapper[4931]: I0130 06:43:00.457551 4931 scope.go:117] "RemoveContainer" containerID="cbeb8eeb50114e9f7323f10c636e65f223a4a4ce3cb8536dd62a68b32d0fcd46" Jan 30 06:43:00 crc kubenswrapper[4931]: I0130 06:43:00.485831 4931 scope.go:117] "RemoveContainer" containerID="a4061e639e286e3a321d0a950315a3048946e43d437d1b9673f6d152b515bf12" Jan 30 06:43:00 crc kubenswrapper[4931]: I0130 06:43:00.904611 4931 generic.go:334] "Generic (PLEG): container finished" podID="eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395" containerID="af4f8c24491864d362f08c260d7724ee10d7788a36dc391894542a043ed83880" exitCode=0 Jan 30 06:43:00 crc kubenswrapper[4931]: I0130 06:43:00.904665 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79684d7c94-4r69m" event={"ID":"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395","Type":"ContainerDied","Data":"af4f8c24491864d362f08c260d7724ee10d7788a36dc391894542a043ed83880"} Jan 30 06:43:01 crc kubenswrapper[4931]: I0130 06:43:01.924873 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79684d7c94-4r69m" event={"ID":"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395","Type":"ContainerStarted","Data":"472b256dc43e23ba13a9c133b6ae84fb385d090c66373de1f283ad92d133dbfd"} Jan 30 06:43:01 crc kubenswrapper[4931]: I0130 06:43:01.925346 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:43:01 crc kubenswrapper[4931]: I0130 06:43:01.925360 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79684d7c94-4r69m" event={"ID":"eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395","Type":"ContainerStarted","Data":"c169ceba7e478d663c2966aefcc21ea691f8d7a9b6b4e66c4a90e69171df77cb"} Jan 30 06:43:01 crc kubenswrapper[4931]: I0130 06:43:01.925378 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:43:01 crc kubenswrapper[4931]: I0130 06:43:01.962070 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-79684d7c94-4r69m" podStartSLOduration=2.564466702 podStartE2EDuration="12.962049505s" podCreationTimestamp="2026-01-30 06:42:49 +0000 UTC" firstStartedPulling="2026-01-30 06:42:50.090034722 +0000 UTC m=+5705.459944979" lastFinishedPulling="2026-01-30 06:43:00.487617525 +0000 UTC m=+5715.857527782" observedRunningTime="2026-01-30 06:43:01.957467607 +0000 UTC m=+5717.327377894" watchObservedRunningTime="2026-01-30 06:43:01.962049505 +0000 UTC m=+5717.331959782" Jan 30 06:43:04 crc kubenswrapper[4931]: I0130 06:43:04.422013 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:43:04 crc kubenswrapper[4931]: E0130 06:43:04.423033 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.784382 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-g99g6"] Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.786024 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.792689 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.792970 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.793052 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.800808 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-g99g6"] Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.930783 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2c16935c-c83b-4b45-b4cd-b61f20ee764f-hm-ports\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.931244 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c16935c-c83b-4b45-b4cd-b61f20ee764f-scripts\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.931369 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c16935c-c83b-4b45-b4cd-b61f20ee764f-config-data-merged\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:06 crc kubenswrapper[4931]: I0130 06:43:06.931392 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c16935c-c83b-4b45-b4cd-b61f20ee764f-config-data\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.032899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2c16935c-c83b-4b45-b4cd-b61f20ee764f-hm-ports\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.032991 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c16935c-c83b-4b45-b4cd-b61f20ee764f-scripts\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.033050 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c16935c-c83b-4b45-b4cd-b61f20ee764f-config-data-merged\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.033067 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c16935c-c83b-4b45-b4cd-b61f20ee764f-config-data\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.033548 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/2c16935c-c83b-4b45-b4cd-b61f20ee764f-config-data-merged\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.033710 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2c16935c-c83b-4b45-b4cd-b61f20ee764f-hm-ports\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.039146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c16935c-c83b-4b45-b4cd-b61f20ee764f-scripts\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.047908 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c16935c-c83b-4b45-b4cd-b61f20ee764f-config-data\") pod \"octavia-rsyslog-g99g6\" (UID: \"2c16935c-c83b-4b45-b4cd-b61f20ee764f\") " pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.104594 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.492078 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-6zvwf"] Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.494791 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.497273 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.514022 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-6zvwf"] Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.646274 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3758e44c-007e-45da-87b8-ed2fdeb3e23c-httpd-config\") pod \"octavia-image-upload-65dd99cb46-6zvwf\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.646344 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/3758e44c-007e-45da-87b8-ed2fdeb3e23c-amphora-image\") pod \"octavia-image-upload-65dd99cb46-6zvwf\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.676213 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-g99g6"] Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.748767 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3758e44c-007e-45da-87b8-ed2fdeb3e23c-httpd-config\") pod \"octavia-image-upload-65dd99cb46-6zvwf\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.748846 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: 
\"kubernetes.io/empty-dir/3758e44c-007e-45da-87b8-ed2fdeb3e23c-amphora-image\") pod \"octavia-image-upload-65dd99cb46-6zvwf\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.749354 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/3758e44c-007e-45da-87b8-ed2fdeb3e23c-amphora-image\") pod \"octavia-image-upload-65dd99cb46-6zvwf\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.754859 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3758e44c-007e-45da-87b8-ed2fdeb3e23c-httpd-config\") pod \"octavia-image-upload-65dd99cb46-6zvwf\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.830251 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:07 crc kubenswrapper[4931]: I0130 06:43:07.904105 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-g99g6"] Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.034781 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-g99g6" event={"ID":"2c16935c-c83b-4b45-b4cd-b61f20ee764f","Type":"ContainerStarted","Data":"03c44fb8c53413e83f41c5315c903df78cd0125b4779c5ce07049143f187fe0e"} Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.335639 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-6zvwf"] Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.416921 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-cgvfd" podUID="9bf15e4b-1a09-401b-87e9-97cff0ee8c91" containerName="ovn-controller" probeResult="failure" output=< Jan 30 06:43:08 crc kubenswrapper[4931]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 06:43:08 crc kubenswrapper[4931]: > Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.420253 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.437305 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5tgfr" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.529563 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cgvfd-config-w2szm"] Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.530876 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.532973 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.539602 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cgvfd-config-w2szm"] Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.664553 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-log-ovn\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.664670 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-scripts\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.664707 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.664757 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-additional-scripts\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: 
\"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.664798 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qvh\" (UniqueName: \"kubernetes.io/projected/2f8082cd-2af8-4181-9d8f-73436fea45bc-kube-api-access-x7qvh\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.664817 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run-ovn\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.766027 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-additional-scripts\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.766156 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7qvh\" (UniqueName: \"kubernetes.io/projected/2f8082cd-2af8-4181-9d8f-73436fea45bc-kube-api-access-x7qvh\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.766195 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run-ovn\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.766948 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run-ovn\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.767127 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-log-ovn\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.767221 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-log-ovn\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.767238 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-additional-scripts\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.767371 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-scripts\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.767699 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.767826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.774138 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-scripts\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.795157 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qvh\" (UniqueName: \"kubernetes.io/projected/2f8082cd-2af8-4181-9d8f-73436fea45bc-kube-api-access-x7qvh\") pod \"ovn-controller-cgvfd-config-w2szm\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:08 crc kubenswrapper[4931]: I0130 06:43:08.859319 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:09 crc kubenswrapper[4931]: I0130 06:43:09.058247 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" event={"ID":"3758e44c-007e-45da-87b8-ed2fdeb3e23c","Type":"ContainerStarted","Data":"02c7a36d17a999021624a691df6d0451e8418cc04c05cd38164901646377f2ae"} Jan 30 06:43:09 crc kubenswrapper[4931]: I0130 06:43:09.372190 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cgvfd-config-w2szm"] Jan 30 06:43:09 crc kubenswrapper[4931]: W0130 06:43:09.378863 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f8082cd_2af8_4181_9d8f_73436fea45bc.slice/crio-a2f0e03cf5afdfffd45d581ccaf50b549e6e76e484005e16bc912ac74f8f09c8 WatchSource:0}: Error finding container a2f0e03cf5afdfffd45d581ccaf50b549e6e76e484005e16bc912ac74f8f09c8: Status 404 returned error can't find the container with id a2f0e03cf5afdfffd45d581ccaf50b549e6e76e484005e16bc912ac74f8f09c8 Jan 30 06:43:10 crc kubenswrapper[4931]: I0130 06:43:10.070879 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cgvfd-config-w2szm" event={"ID":"2f8082cd-2af8-4181-9d8f-73436fea45bc","Type":"ContainerStarted","Data":"3f61c0f02fd2c2ced024d1e703805185ca2f7ee2e42863e0e8a06a4f812766d2"} Jan 30 06:43:10 crc kubenswrapper[4931]: I0130 06:43:10.076756 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cgvfd-config-w2szm" event={"ID":"2f8082cd-2af8-4181-9d8f-73436fea45bc","Type":"ContainerStarted","Data":"a2f0e03cf5afdfffd45d581ccaf50b549e6e76e484005e16bc912ac74f8f09c8"} Jan 30 06:43:10 crc kubenswrapper[4931]: I0130 06:43:10.097086 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-cgvfd-config-w2szm" podStartSLOduration=2.097069139 
podStartE2EDuration="2.097069139s" podCreationTimestamp="2026-01-30 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:43:10.089643051 +0000 UTC m=+5725.459553308" watchObservedRunningTime="2026-01-30 06:43:10.097069139 +0000 UTC m=+5725.466979396" Jan 30 06:43:11 crc kubenswrapper[4931]: I0130 06:43:11.082777 4931 generic.go:334] "Generic (PLEG): container finished" podID="2f8082cd-2af8-4181-9d8f-73436fea45bc" containerID="3f61c0f02fd2c2ced024d1e703805185ca2f7ee2e42863e0e8a06a4f812766d2" exitCode=0 Jan 30 06:43:11 crc kubenswrapper[4931]: I0130 06:43:11.082893 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cgvfd-config-w2szm" event={"ID":"2f8082cd-2af8-4181-9d8f-73436fea45bc","Type":"ContainerDied","Data":"3f61c0f02fd2c2ced024d1e703805185ca2f7ee2e42863e0e8a06a4f812766d2"} Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.706571 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.870687 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-scripts\") pod \"2f8082cd-2af8-4181-9d8f-73436fea45bc\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.870788 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-additional-scripts\") pod \"2f8082cd-2af8-4181-9d8f-73436fea45bc\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.871832 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-scripts" (OuterVolumeSpecName: "scripts") pod "2f8082cd-2af8-4181-9d8f-73436fea45bc" (UID: "2f8082cd-2af8-4181-9d8f-73436fea45bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.871847 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2f8082cd-2af8-4181-9d8f-73436fea45bc" (UID: "2f8082cd-2af8-4181-9d8f-73436fea45bc"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.871949 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7qvh\" (UniqueName: \"kubernetes.io/projected/2f8082cd-2af8-4181-9d8f-73436fea45bc-kube-api-access-x7qvh\") pod \"2f8082cd-2af8-4181-9d8f-73436fea45bc\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.872853 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run-ovn\") pod \"2f8082cd-2af8-4181-9d8f-73436fea45bc\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.872882 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run\") pod \"2f8082cd-2af8-4181-9d8f-73436fea45bc\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.872903 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-log-ovn\") pod \"2f8082cd-2af8-4181-9d8f-73436fea45bc\" (UID: \"2f8082cd-2af8-4181-9d8f-73436fea45bc\") " Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.873231 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run" (OuterVolumeSpecName: "var-run") pod "2f8082cd-2af8-4181-9d8f-73436fea45bc" (UID: "2f8082cd-2af8-4181-9d8f-73436fea45bc"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.873262 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2f8082cd-2af8-4181-9d8f-73436fea45bc" (UID: "2f8082cd-2af8-4181-9d8f-73436fea45bc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.873293 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2f8082cd-2af8-4181-9d8f-73436fea45bc" (UID: "2f8082cd-2af8-4181-9d8f-73436fea45bc"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.873347 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.873367 4931 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8082cd-2af8-4181-9d8f-73436fea45bc-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.873383 4931 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.873395 4931 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:12 crc 
kubenswrapper[4931]: I0130 06:43:12.886702 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8082cd-2af8-4181-9d8f-73436fea45bc-kube-api-access-x7qvh" (OuterVolumeSpecName: "kube-api-access-x7qvh") pod "2f8082cd-2af8-4181-9d8f-73436fea45bc" (UID: "2f8082cd-2af8-4181-9d8f-73436fea45bc"). InnerVolumeSpecName "kube-api-access-x7qvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.975849 4931 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f8082cd-2af8-4181-9d8f-73436fea45bc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:12 crc kubenswrapper[4931]: I0130 06:43:12.975897 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7qvh\" (UniqueName: \"kubernetes.io/projected/2f8082cd-2af8-4181-9d8f-73436fea45bc-kube-api-access-x7qvh\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.103762 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-g99g6" event={"ID":"2c16935c-c83b-4b45-b4cd-b61f20ee764f","Type":"ContainerStarted","Data":"c3925675d16d4559e27133114b0b918739304e4ebddd62d74c52e33ae5f25e71"} Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.106992 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cgvfd-config-w2szm" event={"ID":"2f8082cd-2af8-4181-9d8f-73436fea45bc","Type":"ContainerDied","Data":"a2f0e03cf5afdfffd45d581ccaf50b549e6e76e484005e16bc912ac74f8f09c8"} Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.107030 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2f0e03cf5afdfffd45d581ccaf50b549e6e76e484005e16bc912ac74f8f09c8" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.107078 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cgvfd-config-w2szm" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.171205 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-cgvfd-config-w2szm"] Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.187568 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-cgvfd-config-w2szm"] Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.314537 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-2clsb"] Jan 30 06:43:13 crc kubenswrapper[4931]: E0130 06:43:13.315002 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8082cd-2af8-4181-9d8f-73436fea45bc" containerName="ovn-config" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.315021 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8082cd-2af8-4181-9d8f-73436fea45bc" containerName="ovn-config" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.315201 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8082cd-2af8-4181-9d8f-73436fea45bc" containerName="ovn-config" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.316221 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.318065 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.323406 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-2clsb"] Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.384550 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data-merged\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.384650 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.385637 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-scripts\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.385663 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-combined-ca-bundle\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 
06:43:13.408914 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-cgvfd" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.437164 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8082cd-2af8-4181-9d8f-73436fea45bc" path="/var/lib/kubelet/pods/2f8082cd-2af8-4181-9d8f-73436fea45bc/volumes" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.487870 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-scripts\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.487915 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-combined-ca-bundle\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.488071 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data-merged\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.488142 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.489766 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data-merged\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.493530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-scripts\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.493631 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.507988 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-combined-ca-bundle\") pod \"octavia-db-sync-2clsb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:13 crc kubenswrapper[4931]: I0130 06:43:13.642965 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:14 crc kubenswrapper[4931]: I0130 06:43:14.128244 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-2clsb"] Jan 30 06:43:14 crc kubenswrapper[4931]: W0130 06:43:14.135193 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c7503f1_c8e7_4b48_9dad_a4a221ebdbbb.slice/crio-ae7cb9704c5cb2662c592a30651057ff3bbb6f8f549e1ae4f1cbae5155be4ce3 WatchSource:0}: Error finding container ae7cb9704c5cb2662c592a30651057ff3bbb6f8f549e1ae4f1cbae5155be4ce3: Status 404 returned error can't find the container with id ae7cb9704c5cb2662c592a30651057ff3bbb6f8f549e1ae4f1cbae5155be4ce3 Jan 30 06:43:15 crc kubenswrapper[4931]: I0130 06:43:15.143189 4931 generic.go:334] "Generic (PLEG): container finished" podID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" containerID="c4a34d96918d76961993d44bfa88235147b0d786fa8c13e4e74a51d5c91d0e97" exitCode=0 Jan 30 06:43:15 crc kubenswrapper[4931]: I0130 06:43:15.143617 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-2clsb" event={"ID":"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb","Type":"ContainerDied","Data":"c4a34d96918d76961993d44bfa88235147b0d786fa8c13e4e74a51d5c91d0e97"} Jan 30 06:43:15 crc kubenswrapper[4931]: I0130 06:43:15.143645 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-2clsb" event={"ID":"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb","Type":"ContainerStarted","Data":"ae7cb9704c5cb2662c592a30651057ff3bbb6f8f549e1ae4f1cbae5155be4ce3"} Jan 30 06:43:15 crc kubenswrapper[4931]: I0130 06:43:15.153677 4931 generic.go:334] "Generic (PLEG): container finished" podID="2c16935c-c83b-4b45-b4cd-b61f20ee764f" containerID="c3925675d16d4559e27133114b0b918739304e4ebddd62d74c52e33ae5f25e71" exitCode=0 Jan 30 06:43:15 crc kubenswrapper[4931]: I0130 06:43:15.153731 4931 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/octavia-rsyslog-g99g6" event={"ID":"2c16935c-c83b-4b45-b4cd-b61f20ee764f","Type":"ContainerDied","Data":"c3925675d16d4559e27133114b0b918739304e4ebddd62d74c52e33ae5f25e71"} Jan 30 06:43:16 crc kubenswrapper[4931]: I0130 06:43:16.423107 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:43:16 crc kubenswrapper[4931]: E0130 06:43:16.423660 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:43:19 crc kubenswrapper[4931]: I0130 06:43:19.192186 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" event={"ID":"3758e44c-007e-45da-87b8-ed2fdeb3e23c","Type":"ContainerStarted","Data":"f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da"} Jan 30 06:43:19 crc kubenswrapper[4931]: I0130 06:43:19.196199 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-2clsb" event={"ID":"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb","Type":"ContainerStarted","Data":"870048acdaf12d315c0defcb750ddf7659f2ce415320432c25b469719ef6d1d5"} Jan 30 06:43:19 crc kubenswrapper[4931]: I0130 06:43:19.235204 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-2clsb" podStartSLOduration=6.235183668 podStartE2EDuration="6.235183668s" podCreationTimestamp="2026-01-30 06:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:43:19.228438209 +0000 UTC m=+5734.598348466" 
watchObservedRunningTime="2026-01-30 06:43:19.235183668 +0000 UTC m=+5734.605093925" Jan 30 06:43:20 crc kubenswrapper[4931]: I0130 06:43:20.222852 4931 generic.go:334] "Generic (PLEG): container finished" podID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" containerID="f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da" exitCode=0 Jan 30 06:43:20 crc kubenswrapper[4931]: I0130 06:43:20.222921 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" event={"ID":"3758e44c-007e-45da-87b8-ed2fdeb3e23c","Type":"ContainerDied","Data":"f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da"} Jan 30 06:43:21 crc kubenswrapper[4931]: I0130 06:43:21.239753 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" event={"ID":"3758e44c-007e-45da-87b8-ed2fdeb3e23c","Type":"ContainerStarted","Data":"929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7"} Jan 30 06:43:21 crc kubenswrapper[4931]: I0130 06:43:21.246968 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-g99g6" event={"ID":"2c16935c-c83b-4b45-b4cd-b61f20ee764f","Type":"ContainerStarted","Data":"2772eecf4cd1ca5855b05ddd85c6c56124106ff7213c37304726e3e201063cd6"} Jan 30 06:43:21 crc kubenswrapper[4931]: I0130 06:43:21.248001 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:21 crc kubenswrapper[4931]: I0130 06:43:21.278312 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" podStartSLOduration=3.7254788899999998 podStartE2EDuration="14.278285615s" podCreationTimestamp="2026-01-30 06:43:07 +0000 UTC" firstStartedPulling="2026-01-30 06:43:08.364270993 +0000 UTC m=+5723.734181250" lastFinishedPulling="2026-01-30 06:43:18.917077718 +0000 UTC m=+5734.286987975" observedRunningTime="2026-01-30 06:43:21.269698375 +0000 
UTC m=+5736.639608672" watchObservedRunningTime="2026-01-30 06:43:21.278285615 +0000 UTC m=+5736.648195902" Jan 30 06:43:23 crc kubenswrapper[4931]: I0130 06:43:23.367492 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:43:23 crc kubenswrapper[4931]: I0130 06:43:23.417361 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-g99g6" podStartSLOduration=4.877635547 podStartE2EDuration="17.417340828s" podCreationTimestamp="2026-01-30 06:43:06 +0000 UTC" firstStartedPulling="2026-01-30 06:43:07.684800614 +0000 UTC m=+5723.054710871" lastFinishedPulling="2026-01-30 06:43:20.224505865 +0000 UTC m=+5735.594416152" observedRunningTime="2026-01-30 06:43:21.313672545 +0000 UTC m=+5736.683582822" watchObservedRunningTime="2026-01-30 06:43:23.417340828 +0000 UTC m=+5738.787251095" Jan 30 06:43:23 crc kubenswrapper[4931]: I0130 06:43:23.652113 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-79684d7c94-4r69m" Jan 30 06:43:25 crc kubenswrapper[4931]: I0130 06:43:25.338286 4931 generic.go:334] "Generic (PLEG): container finished" podID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" containerID="870048acdaf12d315c0defcb750ddf7659f2ce415320432c25b469719ef6d1d5" exitCode=0 Jan 30 06:43:25 crc kubenswrapper[4931]: I0130 06:43:25.338714 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-2clsb" event={"ID":"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb","Type":"ContainerDied","Data":"870048acdaf12d315c0defcb750ddf7659f2ce415320432c25b469719ef6d1d5"} Jan 30 06:43:26 crc kubenswrapper[4931]: I0130 06:43:26.820225 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:26 crc kubenswrapper[4931]: I0130 06:43:26.981217 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-combined-ca-bundle\") pod \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " Jan 30 06:43:26 crc kubenswrapper[4931]: I0130 06:43:26.981305 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data\") pod \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " Jan 30 06:43:26 crc kubenswrapper[4931]: I0130 06:43:26.981362 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-scripts\") pod \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " Jan 30 06:43:26 crc kubenswrapper[4931]: I0130 06:43:26.981482 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data-merged\") pod \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\" (UID: \"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb\") " Jan 30 06:43:26 crc kubenswrapper[4931]: I0130 06:43:26.986609 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data" (OuterVolumeSpecName: "config-data") pod "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" (UID: "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:43:26 crc kubenswrapper[4931]: I0130 06:43:26.991622 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-scripts" (OuterVolumeSpecName: "scripts") pod "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" (UID: "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.005396 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" (UID: "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.005462 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" (UID: "6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.084244 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.084272 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.084281 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.084289 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb-config-data-merged\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.363554 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-2clsb" event={"ID":"6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb","Type":"ContainerDied","Data":"ae7cb9704c5cb2662c592a30651057ff3bbb6f8f549e1ae4f1cbae5155be4ce3"} Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.363601 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7cb9704c5cb2662c592a30651057ff3bbb6f8f549e1ae4f1cbae5155be4ce3" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.363629 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-2clsb" Jan 30 06:43:27 crc kubenswrapper[4931]: I0130 06:43:27.422782 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:43:27 crc kubenswrapper[4931]: E0130 06:43:27.423246 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:43:37 crc kubenswrapper[4931]: I0130 06:43:37.166721 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-g99g6" Jan 30 06:43:39 crc kubenswrapper[4931]: I0130 06:43:39.422746 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:43:39 crc kubenswrapper[4931]: E0130 06:43:39.423525 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:43:50 crc kubenswrapper[4931]: I0130 06:43:50.764470 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-6zvwf"] Jan 30 06:43:50 crc kubenswrapper[4931]: I0130 06:43:50.765301 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" podUID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" 
containerName="octavia-amphora-httpd" containerID="cri-o://929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7" gracePeriod=30 Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.313091 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.476840 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3758e44c-007e-45da-87b8-ed2fdeb3e23c-httpd-config\") pod \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.476985 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/3758e44c-007e-45da-87b8-ed2fdeb3e23c-amphora-image\") pod \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\" (UID: \"3758e44c-007e-45da-87b8-ed2fdeb3e23c\") " Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.509005 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3758e44c-007e-45da-87b8-ed2fdeb3e23c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3758e44c-007e-45da-87b8-ed2fdeb3e23c" (UID: "3758e44c-007e-45da-87b8-ed2fdeb3e23c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.570236 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3758e44c-007e-45da-87b8-ed2fdeb3e23c-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "3758e44c-007e-45da-87b8-ed2fdeb3e23c" (UID: "3758e44c-007e-45da-87b8-ed2fdeb3e23c"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.579071 4931 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/3758e44c-007e-45da-87b8-ed2fdeb3e23c-amphora-image\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.579126 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3758e44c-007e-45da-87b8-ed2fdeb3e23c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.648839 4931 generic.go:334] "Generic (PLEG): container finished" podID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" containerID="929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7" exitCode=0 Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.648885 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.648949 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" event={"ID":"3758e44c-007e-45da-87b8-ed2fdeb3e23c","Type":"ContainerDied","Data":"929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7"} Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.649312 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-6zvwf" event={"ID":"3758e44c-007e-45da-87b8-ed2fdeb3e23c","Type":"ContainerDied","Data":"02c7a36d17a999021624a691df6d0451e8418cc04c05cd38164901646377f2ae"} Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.649352 4931 scope.go:117] "RemoveContainer" containerID="929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.684496 4931 scope.go:117] "RemoveContainer" 
containerID="f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.684949 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-6zvwf"] Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.735203 4931 scope.go:117] "RemoveContainer" containerID="929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.735478 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-6zvwf"] Jan 30 06:43:51 crc kubenswrapper[4931]: E0130 06:43:51.735724 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7\": container with ID starting with 929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7 not found: ID does not exist" containerID="929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.735766 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7"} err="failed to get container status \"929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7\": rpc error: code = NotFound desc = could not find container \"929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7\": container with ID starting with 929ac0196599bf56ecf437d08494c70ab46bf574987f0b56c3266610e5ada2d7 not found: ID does not exist" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.735820 4931 scope.go:117] "RemoveContainer" containerID="f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da" Jan 30 06:43:51 crc kubenswrapper[4931]: E0130 06:43:51.736147 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da\": container with ID starting with f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da not found: ID does not exist" containerID="f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da" Jan 30 06:43:51 crc kubenswrapper[4931]: I0130 06:43:51.736182 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da"} err="failed to get container status \"f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da\": rpc error: code = NotFound desc = could not find container \"f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da\": container with ID starting with f42ddf8c49c2965abc82a7ab03416bcc67ad1377c7d9c720d14d82b166cad6da not found: ID does not exist" Jan 30 06:43:53 crc kubenswrapper[4931]: I0130 06:43:53.424195 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:43:53 crc kubenswrapper[4931]: E0130 06:43:53.425028 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:43:53 crc kubenswrapper[4931]: I0130 06:43:53.439851 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" path="/var/lib/kubelet/pods/3758e44c-007e-45da-87b8-ed2fdeb3e23c/volumes" Jan 30 06:44:07 crc kubenswrapper[4931]: I0130 06:44:07.422817 4931 scope.go:117] "RemoveContainer" 
containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:44:07 crc kubenswrapper[4931]: E0130 06:44:07.424195 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.158799 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-k6c7h"] Jan 30 06:44:12 crc kubenswrapper[4931]: E0130 06:44:12.160094 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" containerName="octavia-amphora-httpd" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.160114 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" containerName="octavia-amphora-httpd" Jan 30 06:44:12 crc kubenswrapper[4931]: E0130 06:44:12.160125 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" containerName="init" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.160133 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" containerName="init" Jan 30 06:44:12 crc kubenswrapper[4931]: E0130 06:44:12.160151 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" containerName="init" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.160159 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" containerName="init" Jan 30 06:44:12 crc kubenswrapper[4931]: E0130 06:44:12.160188 4931 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" containerName="octavia-db-sync" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.160195 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" containerName="octavia-db-sync" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.160410 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3758e44c-007e-45da-87b8-ed2fdeb3e23c" containerName="octavia-amphora-httpd" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.160456 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" containerName="octavia-db-sync" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.161629 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.164996 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.165416 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.166618 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.190042 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-k6c7h"] Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.273654 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-amphora-certs\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc 
kubenswrapper[4931]: I0130 06:44:12.274350 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-combined-ca-bundle\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.274460 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2c0bd14d-9378-4c91-87e8-4ec9681103e0-hm-ports\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.274641 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-config-data\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.275149 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-scripts\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.275247 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c0bd14d-9378-4c91-87e8-4ec9681103e0-config-data-merged\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 
06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.377123 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2c0bd14d-9378-4c91-87e8-4ec9681103e0-hm-ports\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.377225 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-config-data\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.377292 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-scripts\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.377316 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c0bd14d-9378-4c91-87e8-4ec9681103e0-config-data-merged\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.377460 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-amphora-certs\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.377513 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-combined-ca-bundle\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.378694 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c0bd14d-9378-4c91-87e8-4ec9681103e0-config-data-merged\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.379051 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2c0bd14d-9378-4c91-87e8-4ec9681103e0-hm-ports\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.386836 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-scripts\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.388482 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-config-data\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.392363 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-amphora-certs\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.393636 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0bd14d-9378-4c91-87e8-4ec9681103e0-combined-ca-bundle\") pod \"octavia-healthmanager-k6c7h\" (UID: \"2c0bd14d-9378-4c91-87e8-4ec9681103e0\") " pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:12 crc kubenswrapper[4931]: I0130 06:44:12.487038 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-k6c7h" Jan 30 06:44:13 crc kubenswrapper[4931]: I0130 06:44:13.286790 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-k6c7h"] Jan 30 06:44:13 crc kubenswrapper[4931]: I0130 06:44:13.931471 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-k6c7h" event={"ID":"2c0bd14d-9378-4c91-87e8-4ec9681103e0","Type":"ContainerStarted","Data":"8a52de91951bbdcb31ccc68846d7cc777f1b38ff89e122b595ea35fd9a76b55b"} Jan 30 06:44:13 crc kubenswrapper[4931]: I0130 06:44:13.931865 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-k6c7h" event={"ID":"2c0bd14d-9378-4c91-87e8-4ec9681103e0","Type":"ContainerStarted","Data":"bbb5eb015173bf1da64fbbaa1f0106d5a0771083daa2da6603ec6d8fe9b198e1"} Jan 30 06:44:13 crc kubenswrapper[4931]: I0130 06:44:13.977695 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-fc9fv"] Jan 30 06:44:13 crc kubenswrapper[4931]: I0130 06:44:13.979704 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:13 crc kubenswrapper[4931]: I0130 06:44:13.983383 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Jan 30 06:44:13 crc kubenswrapper[4931]: I0130 06:44:13.983685 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.008011 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-fc9fv"] Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.119769 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8e6a5234-c995-4b65-afb5-e59eedb65e7f-hm-ports\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.119857 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-combined-ca-bundle\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.119908 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-config-data\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.119963 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/8e6a5234-c995-4b65-afb5-e59eedb65e7f-config-data-merged\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.120078 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-amphora-certs\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.120596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-scripts\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.221714 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-scripts\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.221769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8e6a5234-c995-4b65-afb5-e59eedb65e7f-hm-ports\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.221806 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-combined-ca-bundle\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.221844 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-config-data\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.221889 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8e6a5234-c995-4b65-afb5-e59eedb65e7f-config-data-merged\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.221934 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-amphora-certs\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.223110 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8e6a5234-c995-4b65-afb5-e59eedb65e7f-config-data-merged\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.223875 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8e6a5234-c995-4b65-afb5-e59eedb65e7f-hm-ports\") pod 
\"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.227289 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-amphora-certs\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.227413 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-combined-ca-bundle\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.227618 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-config-data\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.236382 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e6a5234-c995-4b65-afb5-e59eedb65e7f-scripts\") pod \"octavia-housekeeping-fc9fv\" (UID: \"8e6a5234-c995-4b65-afb5-e59eedb65e7f\") " pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.302117 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-fc9fv" Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.873746 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-fc9fv"] Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.886087 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:44:14 crc kubenswrapper[4931]: I0130 06:44:14.941508 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-fc9fv" event={"ID":"8e6a5234-c995-4b65-afb5-e59eedb65e7f","Type":"ContainerStarted","Data":"6fb62f1608a0c2d4c77b4959ba80d11f43c4414d3987ec676a3507cf46ceb294"} Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.000267 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-68w9j"] Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.001874 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.005062 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.005271 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.027767 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-68w9j"] Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.075500 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-scripts\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.075845 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-amphora-certs\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.075878 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-config-data\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.076025 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5a81fa26-7f20-43ef-922e-a9e63ee73709-hm-ports\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.076069 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5a81fa26-7f20-43ef-922e-a9e63ee73709-config-data-merged\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.076287 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-combined-ca-bundle\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.178489 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5a81fa26-7f20-43ef-922e-a9e63ee73709-hm-ports\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.178555 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5a81fa26-7f20-43ef-922e-a9e63ee73709-config-data-merged\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.178611 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-combined-ca-bundle\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.178664 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-scripts\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.178693 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-amphora-certs\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j" Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.178731 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-config-data\") pod \"octavia-worker-68w9j\" (UID: 
\"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.179203 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5a81fa26-7f20-43ef-922e-a9e63ee73709-config-data-merged\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.179592 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5a81fa26-7f20-43ef-922e-a9e63ee73709-hm-ports\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.185106 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-combined-ca-bundle\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.185125 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-scripts\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.185742 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-amphora-certs\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.189283 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a81fa26-7f20-43ef-922e-a9e63ee73709-config-data\") pod \"octavia-worker-68w9j\" (UID: \"5a81fa26-7f20-43ef-922e-a9e63ee73709\") " pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.328169 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.954751 4931 generic.go:334] "Generic (PLEG): container finished" podID="2c0bd14d-9378-4c91-87e8-4ec9681103e0" containerID="8a52de91951bbdcb31ccc68846d7cc777f1b38ff89e122b595ea35fd9a76b55b" exitCode=0
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.954837 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-k6c7h" event={"ID":"2c0bd14d-9378-4c91-87e8-4ec9681103e0","Type":"ContainerDied","Data":"8a52de91951bbdcb31ccc68846d7cc777f1b38ff89e122b595ea35fd9a76b55b"}
Jan 30 06:44:15 crc kubenswrapper[4931]: I0130 06:44:15.982416 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-68w9j"]
Jan 30 06:44:16 crc kubenswrapper[4931]: I0130 06:44:16.639548 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-k6c7h"]
Jan 30 06:44:16 crc kubenswrapper[4931]: I0130 06:44:16.966242 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-k6c7h" event={"ID":"2c0bd14d-9378-4c91-87e8-4ec9681103e0","Type":"ContainerStarted","Data":"6122060f494296a8e4b2e1c6ee845d723adc0c686e59941c94d53816d5a63cda"}
Jan 30 06:44:16 crc kubenswrapper[4931]: I0130 06:44:16.967239 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-k6c7h"
Jan 30 06:44:16 crc kubenswrapper[4931]: I0130 06:44:16.967847 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-68w9j" event={"ID":"5a81fa26-7f20-43ef-922e-a9e63ee73709","Type":"ContainerStarted","Data":"f54d28225b778ea2ddc3b78776bb0991a57090591c3d8b47390b17a542b19f4e"}
Jan 30 06:44:16 crc kubenswrapper[4931]: I0130 06:44:16.996709 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-k6c7h" podStartSLOduration=4.9966858819999995 podStartE2EDuration="4.996685882s" podCreationTimestamp="2026-01-30 06:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:16.987041692 +0000 UTC m=+5792.356951949" watchObservedRunningTime="2026-01-30 06:44:16.996685882 +0000 UTC m=+5792.366596149"
Jan 30 06:44:17 crc kubenswrapper[4931]: I0130 06:44:17.988477 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-fc9fv" event={"ID":"8e6a5234-c995-4b65-afb5-e59eedb65e7f","Type":"ContainerStarted","Data":"8c7d323579524c3d074fa2b4072a03dfb7e2913bc31be87ed2bea09bc0e4423b"}
Jan 30 06:44:19 crc kubenswrapper[4931]: I0130 06:44:19.002868 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-68w9j" event={"ID":"5a81fa26-7f20-43ef-922e-a9e63ee73709","Type":"ContainerStarted","Data":"f8c6954f1283d00375ad1adbb112c4e95428775aa61270f8fba6ef3d63b0d8ff"}
Jan 30 06:44:19 crc kubenswrapper[4931]: I0130 06:44:19.007757 4931 generic.go:334] "Generic (PLEG): container finished" podID="8e6a5234-c995-4b65-afb5-e59eedb65e7f" containerID="8c7d323579524c3d074fa2b4072a03dfb7e2913bc31be87ed2bea09bc0e4423b" exitCode=0
Jan 30 06:44:19 crc kubenswrapper[4931]: I0130 06:44:19.007811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-fc9fv" event={"ID":"8e6a5234-c995-4b65-afb5-e59eedb65e7f","Type":"ContainerDied","Data":"8c7d323579524c3d074fa2b4072a03dfb7e2913bc31be87ed2bea09bc0e4423b"}
Jan 30 06:44:19 crc kubenswrapper[4931]: I0130 06:44:19.421995 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:44:19 crc kubenswrapper[4931]: E0130 06:44:19.422712 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:44:20 crc kubenswrapper[4931]: I0130 06:44:20.021877 4931 generic.go:334] "Generic (PLEG): container finished" podID="5a81fa26-7f20-43ef-922e-a9e63ee73709" containerID="f8c6954f1283d00375ad1adbb112c4e95428775aa61270f8fba6ef3d63b0d8ff" exitCode=0
Jan 30 06:44:20 crc kubenswrapper[4931]: I0130 06:44:20.022477 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-68w9j" event={"ID":"5a81fa26-7f20-43ef-922e-a9e63ee73709","Type":"ContainerDied","Data":"f8c6954f1283d00375ad1adbb112c4e95428775aa61270f8fba6ef3d63b0d8ff"}
Jan 30 06:44:20 crc kubenswrapper[4931]: I0130 06:44:20.031055 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-fc9fv" event={"ID":"8e6a5234-c995-4b65-afb5-e59eedb65e7f","Type":"ContainerStarted","Data":"2f855c61c20ebef2e100f8d56770a0cf709039a79ad9daf792dbd7ee43daac79"}
Jan 30 06:44:20 crc kubenswrapper[4931]: I0130 06:44:20.031247 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-fc9fv"
Jan 30 06:44:20 crc kubenswrapper[4931]: I0130 06:44:20.081342 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-fc9fv" podStartSLOduration=5.11263152 podStartE2EDuration="7.081314347s" podCreationTimestamp="2026-01-30 06:44:13 +0000 UTC" firstStartedPulling="2026-01-30 06:44:14.885788906 +0000 UTC m=+5790.255699173" lastFinishedPulling="2026-01-30 06:44:16.854471743 +0000 UTC m=+5792.224382000" observedRunningTime="2026-01-30 06:44:20.065051232 +0000 UTC m=+5795.434961499" watchObservedRunningTime="2026-01-30 06:44:20.081314347 +0000 UTC m=+5795.451224644"
Jan 30 06:44:21 crc kubenswrapper[4931]: I0130 06:44:21.051414 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-68w9j" event={"ID":"5a81fa26-7f20-43ef-922e-a9e63ee73709","Type":"ContainerStarted","Data":"11accab5544da0b7dc5a0d5765d3610b5d7ded09781153af52120e63dd5fe556"}
Jan 30 06:44:21 crc kubenswrapper[4931]: I0130 06:44:21.052100 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:21 crc kubenswrapper[4931]: I0130 06:44:21.088862 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-68w9j" podStartSLOduration=4.770406644 podStartE2EDuration="7.088838084s" podCreationTimestamp="2026-01-30 06:44:14 +0000 UTC" firstStartedPulling="2026-01-30 06:44:15.994055002 +0000 UTC m=+5791.363965259" lastFinishedPulling="2026-01-30 06:44:18.312486442 +0000 UTC m=+5793.682396699" observedRunningTime="2026-01-30 06:44:21.082594639 +0000 UTC m=+5796.452504926" watchObservedRunningTime="2026-01-30 06:44:21.088838084 +0000 UTC m=+5796.458748381"
Jan 30 06:44:27 crc kubenswrapper[4931]: I0130 06:44:27.514708 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-k6c7h"
Jan 30 06:44:29 crc kubenswrapper[4931]: I0130 06:44:29.343054 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-fc9fv"
Jan 30 06:44:30 crc kubenswrapper[4931]: I0130 06:44:30.405932 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-68w9j"
Jan 30 06:44:31 crc kubenswrapper[4931]: I0130 06:44:31.423307 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:44:31 crc kubenswrapper[4931]: E0130 06:44:31.423896 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:44:45 crc kubenswrapper[4931]: I0130 06:44:45.441996 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:44:45 crc kubenswrapper[4931]: E0130 06:44:45.442925 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.159509 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"]
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.161842 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.164922 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.165177 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.189582 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"]
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.203695 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d0d02c3-f482-4b3c-b015-544fc50919b7-secret-volume\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.203800 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d0d02c3-f482-4b3c-b015-544fc50919b7-config-volume\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.203921 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6vjx\" (UniqueName: \"kubernetes.io/projected/3d0d02c3-f482-4b3c-b015-544fc50919b7-kube-api-access-k6vjx\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.305827 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d0d02c3-f482-4b3c-b015-544fc50919b7-secret-volume\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.305911 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d0d02c3-f482-4b3c-b015-544fc50919b7-config-volume\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.306020 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6vjx\" (UniqueName: \"kubernetes.io/projected/3d0d02c3-f482-4b3c-b015-544fc50919b7-kube-api-access-k6vjx\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.307628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d0d02c3-f482-4b3c-b015-544fc50919b7-config-volume\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.312109 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d0d02c3-f482-4b3c-b015-544fc50919b7-secret-volume\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.326455 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6vjx\" (UniqueName: \"kubernetes.io/projected/3d0d02c3-f482-4b3c-b015-544fc50919b7-kube-api-access-k6vjx\") pod \"collect-profiles-29495925-rsbpb\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.448290 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:45:00 crc kubenswrapper[4931]: E0130 06:45:00.448820 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.498281 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.685285 4931 scope.go:117] "RemoveContainer" containerID="a67c2af59ce774fac5d99d16e2c4d0308297f692031a8a81e468f5bef97702ac"
Jan 30 06:45:00 crc kubenswrapper[4931]: I0130 06:45:00.722933 4931 scope.go:117] "RemoveContainer" containerID="7956c67be4873a213d6ce531a234902ab8e420fef8d81bd2f9cc50a55b2ed19e"
Jan 30 06:45:01 crc kubenswrapper[4931]: I0130 06:45:01.006145 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"]
Jan 30 06:45:01 crc kubenswrapper[4931]: I0130 06:45:01.638661 4931 generic.go:334] "Generic (PLEG): container finished" podID="3d0d02c3-f482-4b3c-b015-544fc50919b7" containerID="a28d2da22cb75022f4cebf8bfd4527b7d329cedd3a234cdb0a658e8ba3685b7e" exitCode=0
Jan 30 06:45:01 crc kubenswrapper[4931]: I0130 06:45:01.638728 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb" event={"ID":"3d0d02c3-f482-4b3c-b015-544fc50919b7","Type":"ContainerDied","Data":"a28d2da22cb75022f4cebf8bfd4527b7d329cedd3a234cdb0a658e8ba3685b7e"}
Jan 30 06:45:01 crc kubenswrapper[4931]: I0130 06:45:01.639671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb" event={"ID":"3d0d02c3-f482-4b3c-b015-544fc50919b7","Type":"ContainerStarted","Data":"a209fefa99ef75ed104126fe67e2831923a710d9621f51f36a272702f29e5f02"}
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.116964 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.216507 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6vjx\" (UniqueName: \"kubernetes.io/projected/3d0d02c3-f482-4b3c-b015-544fc50919b7-kube-api-access-k6vjx\") pod \"3d0d02c3-f482-4b3c-b015-544fc50919b7\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") "
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.216749 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d0d02c3-f482-4b3c-b015-544fc50919b7-config-volume\") pod \"3d0d02c3-f482-4b3c-b015-544fc50919b7\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") "
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.216887 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d0d02c3-f482-4b3c-b015-544fc50919b7-secret-volume\") pod \"3d0d02c3-f482-4b3c-b015-544fc50919b7\" (UID: \"3d0d02c3-f482-4b3c-b015-544fc50919b7\") "
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.218123 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d0d02c3-f482-4b3c-b015-544fc50919b7-config-volume" (OuterVolumeSpecName: "config-volume") pod "3d0d02c3-f482-4b3c-b015-544fc50919b7" (UID: "3d0d02c3-f482-4b3c-b015-544fc50919b7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.233793 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0d02c3-f482-4b3c-b015-544fc50919b7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3d0d02c3-f482-4b3c-b015-544fc50919b7" (UID: "3d0d02c3-f482-4b3c-b015-544fc50919b7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.233944 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d0d02c3-f482-4b3c-b015-544fc50919b7-kube-api-access-k6vjx" (OuterVolumeSpecName: "kube-api-access-k6vjx") pod "3d0d02c3-f482-4b3c-b015-544fc50919b7" (UID: "3d0d02c3-f482-4b3c-b015-544fc50919b7"). InnerVolumeSpecName "kube-api-access-k6vjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.318583 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d0d02c3-f482-4b3c-b015-544fc50919b7-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.318809 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6vjx\" (UniqueName: \"kubernetes.io/projected/3d0d02c3-f482-4b3c-b015-544fc50919b7-kube-api-access-k6vjx\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.318869 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d0d02c3-f482-4b3c-b015-544fc50919b7-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.662344 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb" event={"ID":"3d0d02c3-f482-4b3c-b015-544fc50919b7","Type":"ContainerDied","Data":"a209fefa99ef75ed104126fe67e2831923a710d9621f51f36a272702f29e5f02"}
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.662387 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a209fefa99ef75ed104126fe67e2831923a710d9621f51f36a272702f29e5f02"
Jan 30 06:45:03 crc kubenswrapper[4931]: I0130 06:45:03.662811 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-rsbpb"
Jan 30 06:45:04 crc kubenswrapper[4931]: I0130 06:45:04.209176 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g"]
Jan 30 06:45:04 crc kubenswrapper[4931]: I0130 06:45:04.224244 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-jrm5g"]
Jan 30 06:45:05 crc kubenswrapper[4931]: I0130 06:45:05.441456 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba" path="/var/lib/kubelet/pods/ffc6cfb6-b46d-49b4-b05d-e9c2b033a1ba/volumes"
Jan 30 06:45:12 crc kubenswrapper[4931]: I0130 06:45:12.050730 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cxbxk"]
Jan 30 06:45:12 crc kubenswrapper[4931]: I0130 06:45:12.062502 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-64f9-account-create-update-sm7kp"]
Jan 30 06:45:12 crc kubenswrapper[4931]: I0130 06:45:12.072903 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cxbxk"]
Jan 30 06:45:12 crc kubenswrapper[4931]: I0130 06:45:12.081836 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-64f9-account-create-update-sm7kp"]
Jan 30 06:45:13 crc kubenswrapper[4931]: I0130 06:45:13.438417 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e8b686f-89e9-4561-b4da-73c3087f1913" path="/var/lib/kubelet/pods/7e8b686f-89e9-4561-b4da-73c3087f1913/volumes"
Jan 30 06:45:13 crc kubenswrapper[4931]: I0130 06:45:13.440124 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1237d07-19d9-47bb-8fb8-42e905fcc41b" path="/var/lib/kubelet/pods/f1237d07-19d9-47bb-8fb8-42e905fcc41b/volumes"
Jan 30 06:45:14 crc kubenswrapper[4931]: I0130 06:45:14.422729 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:45:14 crc kubenswrapper[4931]: E0130 06:45:14.423345 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.028850 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fd79f5877-c4n2v"]
Jan 30 06:45:17 crc kubenswrapper[4931]: E0130 06:45:17.029718 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0d02c3-f482-4b3c-b015-544fc50919b7" containerName="collect-profiles"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.029732 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0d02c3-f482-4b3c-b015-544fc50919b7" containerName="collect-profiles"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.029914 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0d02c3-f482-4b3c-b015-544fc50919b7" containerName="collect-profiles"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.031032 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.038393 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-zzqsh"]
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.039670 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.039725 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.039872 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-s44jf"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.039972 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.048909 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zzqsh"]
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.057954 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fd79f5877-c4n2v"]
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.094545 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.094779 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-log" containerID="cri-o://d05f5e569f00e57864985b93b02a9cff6181003207de011c326a946bc0f2b2f3" gracePeriod=30
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.094859 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-httpd" containerID="cri-o://9f0644fcac396be5d0abd6c6b2b21170c61642dc6f5db73f638093f772151226" gracePeriod=30
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.122655 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-scripts\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.122742 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00dba7a3-7492-4010-9931-1ed387dc22a7-logs\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.122762 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00dba7a3-7492-4010-9931-1ed387dc22a7-horizon-secret-key\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.122891 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-config-data\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.122929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5pmn\" (UniqueName: \"kubernetes.io/projected/00dba7a3-7492-4010-9931-1ed387dc22a7-kube-api-access-l5pmn\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.142069 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86ccfbfc65-5jz59"]
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.143910 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.158299 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.158629 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-log" containerID="cri-o://32f7ab137db348695ddb60d19f60238d0ad9feb2d7bfb8d7925247bd1ac76d50" gracePeriod=30
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.158771 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-httpd" containerID="cri-o://d2e2adf695545abae4fa1879a24ad260b75515f25d086b16166bdfe80e55cc22" gracePeriod=30
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.189773 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86ccfbfc65-5jz59"]
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.224925 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-config-data\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.224986 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-scripts\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225005 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5pmn\" (UniqueName: \"kubernetes.io/projected/00dba7a3-7492-4010-9931-1ed387dc22a7-kube-api-access-l5pmn\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-config-data\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225051 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hdtv\" (UniqueName: \"kubernetes.io/projected/e180809a-b692-42c0-b821-723afe805954-kube-api-access-7hdtv\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225075 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e180809a-b692-42c0-b821-723afe805954-horizon-secret-key\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225104 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-scripts\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225271 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00dba7a3-7492-4010-9931-1ed387dc22a7-logs\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225312 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00dba7a3-7492-4010-9931-1ed387dc22a7-horizon-secret-key\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225515 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e180809a-b692-42c0-b821-723afe805954-logs\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.225809 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00dba7a3-7492-4010-9931-1ed387dc22a7-logs\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.226093 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-scripts\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.227107 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-config-data\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.236091 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00dba7a3-7492-4010-9931-1ed387dc22a7-horizon-secret-key\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.239734 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5pmn\" (UniqueName: \"kubernetes.io/projected/00dba7a3-7492-4010-9931-1ed387dc22a7-kube-api-access-l5pmn\") pod \"horizon-5fd79f5877-c4n2v\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.327053 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-config-data\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.327097 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hdtv\" (UniqueName: \"kubernetes.io/projected/e180809a-b692-42c0-b821-723afe805954-kube-api-access-7hdtv\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.327126 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e180809a-b692-42c0-b821-723afe805954-horizon-secret-key\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.327256 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e180809a-b692-42c0-b821-723afe805954-logs\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.327315 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-scripts\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.328002 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-scripts\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.328324 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e180809a-b692-42c0-b821-723afe805954-logs\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.328744 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-config-data\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.331329 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e180809a-b692-42c0-b821-723afe805954-horizon-secret-key\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.345717 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.348352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hdtv\" (UniqueName: \"kubernetes.io/projected/e180809a-b692-42c0-b821-723afe805954-kube-api-access-7hdtv\") pod \"horizon-86ccfbfc65-5jz59\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") " pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.431964 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60872807-e034-4844-9f79-8005640c308c" path="/var/lib/kubelet/pods/60872807-e034-4844-9f79-8005640c308c/volumes"
Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.493124 4931 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/horizon-86ccfbfc65-5jz59" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.736720 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fd79f5877-c4n2v"] Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.759603 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b8f5df775-m6dvd"] Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.761718 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.776949 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b8f5df775-m6dvd"] Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.805666 4931 generic.go:334] "Generic (PLEG): container finished" podID="50e397ef-0630-40db-a591-28d7584dee76" containerID="32f7ab137db348695ddb60d19f60238d0ad9feb2d7bfb8d7925247bd1ac76d50" exitCode=143 Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.805712 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50e397ef-0630-40db-a591-28d7584dee76","Type":"ContainerDied","Data":"32f7ab137db348695ddb60d19f60238d0ad9feb2d7bfb8d7925247bd1ac76d50"} Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.809935 4931 generic.go:334] "Generic (PLEG): container finished" podID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerID="d05f5e569f00e57864985b93b02a9cff6181003207de011c326a946bc0f2b2f3" exitCode=143 Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.809959 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ab7585b-916e-4a6a-8aa8-da769aaa437e","Type":"ContainerDied","Data":"d05f5e569f00e57864985b93b02a9cff6181003207de011c326a946bc0f2b2f3"} Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.821367 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-5fd79f5877-c4n2v"] Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.844214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-config-data\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.844354 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-477f8\" (UniqueName: \"kubernetes.io/projected/1d524b32-d060-41f3-88a6-d5339c438fff-kube-api-access-477f8\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.844394 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d524b32-d060-41f3-88a6-d5339c438fff-horizon-secret-key\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.844436 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-scripts\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.844461 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d524b32-d060-41f3-88a6-d5339c438fff-logs\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " 
pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.946392 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-477f8\" (UniqueName: \"kubernetes.io/projected/1d524b32-d060-41f3-88a6-d5339c438fff-kube-api-access-477f8\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.946488 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d524b32-d060-41f3-88a6-d5339c438fff-horizon-secret-key\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.946521 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-scripts\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.946550 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d524b32-d060-41f3-88a6-d5339c438fff-logs\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.946592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-config-data\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.948094 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-config-data\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.949580 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d524b32-d060-41f3-88a6-d5339c438fff-logs\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.950106 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-scripts\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.962747 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d524b32-d060-41f3-88a6-d5339c438fff-horizon-secret-key\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.968118 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-477f8\" (UniqueName: \"kubernetes.io/projected/1d524b32-d060-41f3-88a6-d5339c438fff-kube-api-access-477f8\") pod \"horizon-7b8f5df775-m6dvd\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:17 crc kubenswrapper[4931]: I0130 06:45:17.992461 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86ccfbfc65-5jz59"] Jan 30 06:45:17 crc kubenswrapper[4931]: W0130 06:45:17.995387 
4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode180809a_b692_42c0_b821_723afe805954.slice/crio-6e7fd796d4cb0d311b3d0a188a2af32435b8c86d4ea09b091cca87d30bbf2b5d WatchSource:0}: Error finding container 6e7fd796d4cb0d311b3d0a188a2af32435b8c86d4ea09b091cca87d30bbf2b5d: Status 404 returned error can't find the container with id 6e7fd796d4cb0d311b3d0a188a2af32435b8c86d4ea09b091cca87d30bbf2b5d Jan 30 06:45:18 crc kubenswrapper[4931]: I0130 06:45:18.083745 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:45:18 crc kubenswrapper[4931]: I0130 06:45:18.585691 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b8f5df775-m6dvd"] Jan 30 06:45:18 crc kubenswrapper[4931]: W0130 06:45:18.588198 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d524b32_d060_41f3_88a6_d5339c438fff.slice/crio-b4e9652d680d0769885e340ea1c5e364bdd99c6ea6b532de8c239df55ba5c48d WatchSource:0}: Error finding container b4e9652d680d0769885e340ea1c5e364bdd99c6ea6b532de8c239df55ba5c48d: Status 404 returned error can't find the container with id b4e9652d680d0769885e340ea1c5e364bdd99c6ea6b532de8c239df55ba5c48d Jan 30 06:45:18 crc kubenswrapper[4931]: I0130 06:45:18.827980 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd79f5877-c4n2v" event={"ID":"00dba7a3-7492-4010-9931-1ed387dc22a7","Type":"ContainerStarted","Data":"b88b4f7f43f890396ef5c3306741e800af49790969d1f953fb512e6e66e5766b"} Jan 30 06:45:18 crc kubenswrapper[4931]: I0130 06:45:18.832867 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86ccfbfc65-5jz59" event={"ID":"e180809a-b692-42c0-b821-723afe805954","Type":"ContainerStarted","Data":"6e7fd796d4cb0d311b3d0a188a2af32435b8c86d4ea09b091cca87d30bbf2b5d"} Jan 30 06:45:18 
crc kubenswrapper[4931]: I0130 06:45:18.835389 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f5df775-m6dvd" event={"ID":"1d524b32-d060-41f3-88a6-d5339c438fff","Type":"ContainerStarted","Data":"b4e9652d680d0769885e340ea1c5e364bdd99c6ea6b532de8c239df55ba5c48d"} Jan 30 06:45:20 crc kubenswrapper[4931]: I0130 06:45:20.256293 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.47:9292/healthcheck\": read tcp 10.217.0.2:53626->10.217.1.47:9292: read: connection reset by peer" Jan 30 06:45:20 crc kubenswrapper[4931]: I0130 06:45:20.256352 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.47:9292/healthcheck\": read tcp 10.217.0.2:53638->10.217.1.47:9292: read: connection reset by peer" Jan 30 06:45:20 crc kubenswrapper[4931]: I0130 06:45:20.861053 4931 generic.go:334] "Generic (PLEG): container finished" podID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerID="9f0644fcac396be5d0abd6c6b2b21170c61642dc6f5db73f638093f772151226" exitCode=0 Jan 30 06:45:20 crc kubenswrapper[4931]: I0130 06:45:20.861114 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ab7585b-916e-4a6a-8aa8-da769aaa437e","Type":"ContainerDied","Data":"9f0644fcac396be5d0abd6c6b2b21170c61642dc6f5db73f638093f772151226"} Jan 30 06:45:20 crc kubenswrapper[4931]: I0130 06:45:20.863970 4931 generic.go:334] "Generic (PLEG): container finished" podID="50e397ef-0630-40db-a591-28d7584dee76" containerID="d2e2adf695545abae4fa1879a24ad260b75515f25d086b16166bdfe80e55cc22" exitCode=0 Jan 30 06:45:20 crc kubenswrapper[4931]: I0130 06:45:20.863991 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50e397ef-0630-40db-a591-28d7584dee76","Type":"ContainerDied","Data":"d2e2adf695545abae4fa1879a24ad260b75515f25d086b16166bdfe80e55cc22"} Jan 30 06:45:21 crc kubenswrapper[4931]: I0130 06:45:21.250029 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.48:9292/healthcheck\": dial tcp 10.217.1.48:9292: connect: connection refused" Jan 30 06:45:21 crc kubenswrapper[4931]: I0130 06:45:21.250159 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.48:9292/healthcheck\": dial tcp 10.217.1.48:9292: connect: connection refused" Jan 30 06:45:24 crc kubenswrapper[4931]: I0130 06:45:24.948753 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.000084 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-config-data\") pod \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.000220 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-httpd-run\") pod \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.000251 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-scripts\") pod \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.000268 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-combined-ca-bundle\") pod \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.000285 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-logs\") pod \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.000341 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-ceph\") pod \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.000374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8j2s\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-kube-api-access-h8j2s\") pod \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\" (UID: \"7ab7585b-916e-4a6a-8aa8-da769aaa437e\") " Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.001267 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-logs" (OuterVolumeSpecName: "logs") pod "7ab7585b-916e-4a6a-8aa8-da769aaa437e" (UID: "7ab7585b-916e-4a6a-8aa8-da769aaa437e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.001731 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7ab7585b-916e-4a6a-8aa8-da769aaa437e" (UID: "7ab7585b-916e-4a6a-8aa8-da769aaa437e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.004935 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.004959 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ab7585b-916e-4a6a-8aa8-da769aaa437e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.005788 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-kube-api-access-h8j2s" (OuterVolumeSpecName: "kube-api-access-h8j2s") pod "7ab7585b-916e-4a6a-8aa8-da769aaa437e" (UID: "7ab7585b-916e-4a6a-8aa8-da769aaa437e"). InnerVolumeSpecName "kube-api-access-h8j2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.011531 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-ceph" (OuterVolumeSpecName: "ceph") pod "7ab7585b-916e-4a6a-8aa8-da769aaa437e" (UID: "7ab7585b-916e-4a6a-8aa8-da769aaa437e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.018676 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-scripts" (OuterVolumeSpecName: "scripts") pod "7ab7585b-916e-4a6a-8aa8-da769aaa437e" (UID: "7ab7585b-916e-4a6a-8aa8-da769aaa437e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.039216 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ab7585b-916e-4a6a-8aa8-da769aaa437e" (UID: "7ab7585b-916e-4a6a-8aa8-da769aaa437e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.081554 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-config-data" (OuterVolumeSpecName: "config-data") pod "7ab7585b-916e-4a6a-8aa8-da769aaa437e" (UID: "7ab7585b-916e-4a6a-8aa8-da769aaa437e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.107386 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.107448 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.107459 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab7585b-916e-4a6a-8aa8-da769aaa437e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.107469 4931 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-ceph\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:25 crc kubenswrapper[4931]: 
I0130 06:45:25.107477 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8j2s\" (UniqueName: \"kubernetes.io/projected/7ab7585b-916e-4a6a-8aa8-da769aaa437e-kube-api-access-h8j2s\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.925954 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd79f5877-c4n2v" event={"ID":"00dba7a3-7492-4010-9931-1ed387dc22a7","Type":"ContainerStarted","Data":"0ac695e770aa1e4dca8f308d0921c4323577d50a3d4f87168b23f7117ef54020"} Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.926295 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd79f5877-c4n2v" event={"ID":"00dba7a3-7492-4010-9931-1ed387dc22a7","Type":"ContainerStarted","Data":"3f69e23981e22c16e2b3eb960775aef58a246bf4d2a400c78b43e6f979492f62"} Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.926221 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fd79f5877-c4n2v" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon" containerID="cri-o://0ac695e770aa1e4dca8f308d0921c4323577d50a3d4f87168b23f7117ef54020" gracePeriod=30 Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.926102 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fd79f5877-c4n2v" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon-log" containerID="cri-o://3f69e23981e22c16e2b3eb960775aef58a246bf4d2a400c78b43e6f979492f62" gracePeriod=30 Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.931090 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86ccfbfc65-5jz59" event={"ID":"e180809a-b692-42c0-b821-723afe805954","Type":"ContainerStarted","Data":"700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111"} Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.931133 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-86ccfbfc65-5jz59" event={"ID":"e180809a-b692-42c0-b821-723afe805954","Type":"ContainerStarted","Data":"72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0"} Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.936699 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f5df775-m6dvd" event={"ID":"1d524b32-d060-41f3-88a6-d5339c438fff","Type":"ContainerStarted","Data":"85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88"} Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.936758 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f5df775-m6dvd" event={"ID":"1d524b32-d060-41f3-88a6-d5339c438fff","Type":"ContainerStarted","Data":"16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb"} Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.939952 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ab7585b-916e-4a6a-8aa8-da769aaa437e","Type":"ContainerDied","Data":"266032ac228e0593b292c8f7becc87b38799dd433a3129a19387d1cf6ec27145"} Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.940017 4931 scope.go:117] "RemoveContainer" containerID="9f0644fcac396be5d0abd6c6b2b21170c61642dc6f5db73f638093f772151226" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.940182 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.970324 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fd79f5877-c4n2v" podStartSLOduration=2.122172865 podStartE2EDuration="8.970297948s" podCreationTimestamp="2026-01-30 06:45:17 +0000 UTC" firstStartedPulling="2026-01-30 06:45:17.851000253 +0000 UTC m=+5853.220910510" lastFinishedPulling="2026-01-30 06:45:24.699125336 +0000 UTC m=+5860.069035593" observedRunningTime="2026-01-30 06:45:25.951823001 +0000 UTC m=+5861.321733288" watchObservedRunningTime="2026-01-30 06:45:25.970297948 +0000 UTC m=+5861.340208215" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.994531 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b8f5df775-m6dvd" podStartSLOduration=2.916776204 podStartE2EDuration="8.994516425s" podCreationTimestamp="2026-01-30 06:45:17 +0000 UTC" firstStartedPulling="2026-01-30 06:45:18.591961342 +0000 UTC m=+5853.961871599" lastFinishedPulling="2026-01-30 06:45:24.669701563 +0000 UTC m=+5860.039611820" observedRunningTime="2026-01-30 06:45:25.984341651 +0000 UTC m=+5861.354251918" watchObservedRunningTime="2026-01-30 06:45:25.994516425 +0000 UTC m=+5861.364426682" Jan 30 06:45:25 crc kubenswrapper[4931]: I0130 06:45:25.995034 4931 scope.go:117] "RemoveContainer" containerID="d05f5e569f00e57864985b93b02a9cff6181003207de011c326a946bc0f2b2f3" Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.018979 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86ccfbfc65-5jz59" podStartSLOduration=2.352468168 podStartE2EDuration="9.018958709s" podCreationTimestamp="2026-01-30 06:45:17 +0000 UTC" firstStartedPulling="2026-01-30 06:45:17.99887682 +0000 UTC m=+5853.368787077" lastFinishedPulling="2026-01-30 06:45:24.665367371 +0000 UTC m=+5860.035277618" observedRunningTime="2026-01-30 
06:45:26.010817511 +0000 UTC m=+5861.380727768" watchObservedRunningTime="2026-01-30 06:45:26.018958709 +0000 UTC m=+5861.388868966"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.040613 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.093665 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.104250 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:45:26 crc kubenswrapper[4931]: E0130 06:45:26.105007 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-log"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.105027 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-log"
Jan 30 06:45:26 crc kubenswrapper[4931]: E0130 06:45:26.105053 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-httpd"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.105060 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-httpd"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.105392 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-log"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.105442 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" containerName="glance-httpd"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.112734 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.114275 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.117667 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.145412 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234226 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-scripts\") pod \"50e397ef-0630-40db-a591-28d7584dee76\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") "
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234328 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-httpd-run\") pod \"50e397ef-0630-40db-a591-28d7584dee76\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") "
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234391 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-logs\") pod \"50e397ef-0630-40db-a591-28d7584dee76\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") "
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234444 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-ceph\") pod \"50e397ef-0630-40db-a591-28d7584dee76\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") "
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234473 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-config-data\") pod \"50e397ef-0630-40db-a591-28d7584dee76\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") "
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234536 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-combined-ca-bundle\") pod \"50e397ef-0630-40db-a591-28d7584dee76\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") "
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234576 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7hqx\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-kube-api-access-x7hqx\") pod \"50e397ef-0630-40db-a591-28d7584dee76\" (UID: \"50e397ef-0630-40db-a591-28d7584dee76\") "
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234846 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234924 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94f5b24a-840b-4206-a190-63cd6339ed70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.234962 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94f5b24a-840b-4206-a190-63cd6339ed70-logs\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.235013 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bj8j\" (UniqueName: \"kubernetes.io/projected/94f5b24a-840b-4206-a190-63cd6339ed70-kube-api-access-8bj8j\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.235051 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-scripts\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.235096 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94f5b24a-840b-4206-a190-63cd6339ed70-ceph\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.235123 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-config-data\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.258878 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "50e397ef-0630-40db-a591-28d7584dee76" (UID: "50e397ef-0630-40db-a591-28d7584dee76"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.259197 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-logs" (OuterVolumeSpecName: "logs") pod "50e397ef-0630-40db-a591-28d7584dee76" (UID: "50e397ef-0630-40db-a591-28d7584dee76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.267091 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-kube-api-access-x7hqx" (OuterVolumeSpecName: "kube-api-access-x7hqx") pod "50e397ef-0630-40db-a591-28d7584dee76" (UID: "50e397ef-0630-40db-a591-28d7584dee76"). InnerVolumeSpecName "kube-api-access-x7hqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.269549 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-ceph" (OuterVolumeSpecName: "ceph") pod "50e397ef-0630-40db-a591-28d7584dee76" (UID: "50e397ef-0630-40db-a591-28d7584dee76"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.269677 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-scripts" (OuterVolumeSpecName: "scripts") pod "50e397ef-0630-40db-a591-28d7584dee76" (UID: "50e397ef-0630-40db-a591-28d7584dee76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.333570 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50e397ef-0630-40db-a591-28d7584dee76" (UID: "50e397ef-0630-40db-a591-28d7584dee76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.336447 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bj8j\" (UniqueName: \"kubernetes.io/projected/94f5b24a-840b-4206-a190-63cd6339ed70-kube-api-access-8bj8j\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.336623 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-scripts\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.336745 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94f5b24a-840b-4206-a190-63cd6339ed70-ceph\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.336827 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-config-data\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.336910 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337016 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94f5b24a-840b-4206-a190-63cd6339ed70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337106 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94f5b24a-840b-4206-a190-63cd6339ed70-logs\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337234 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337302 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337369 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50e397ef-0630-40db-a591-28d7584dee76-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337442 4931 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-ceph\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337506 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.337576 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7hqx\" (UniqueName: \"kubernetes.io/projected/50e397ef-0630-40db-a591-28d7584dee76-kube-api-access-x7hqx\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.338010 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94f5b24a-840b-4206-a190-63cd6339ed70-logs\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.345972 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94f5b24a-840b-4206-a190-63cd6339ed70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.359556 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-config-data" (OuterVolumeSpecName: "config-data") pod "50e397ef-0630-40db-a591-28d7584dee76" (UID: "50e397ef-0630-40db-a591-28d7584dee76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.360676 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94f5b24a-840b-4206-a190-63cd6339ed70-ceph\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.361976 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.362924 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-config-data\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.364037 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f5b24a-840b-4206-a190-63cd6339ed70-scripts\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.370202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bj8j\" (UniqueName: \"kubernetes.io/projected/94f5b24a-840b-4206-a190-63cd6339ed70-kube-api-access-8bj8j\") pod \"glance-default-external-api-0\" (UID: \"94f5b24a-840b-4206-a190-63cd6339ed70\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.441994 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e397ef-0630-40db-a591-28d7584dee76-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.467911 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.954409 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.954402 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50e397ef-0630-40db-a591-28d7584dee76","Type":"ContainerDied","Data":"a16e5756f39cf431b53b36d453d1cc052129da02cb0cac04cc9d6bca4777d6a5"}
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.954824 4931 scope.go:117] "RemoveContainer" containerID="d2e2adf695545abae4fa1879a24ad260b75515f25d086b16166bdfe80e55cc22"
Jan 30 06:45:26 crc kubenswrapper[4931]: I0130 06:45:26.991403 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:26.999539 4931 scope.go:117] "RemoveContainer" containerID="32f7ab137db348695ddb60d19f60238d0ad9feb2d7bfb8d7925247bd1ac76d50"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.005361 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.027240 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:45:27 crc kubenswrapper[4931]: E0130 06:45:27.027971 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-httpd"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.028003 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-httpd"
Jan 30 06:45:27 crc kubenswrapper[4931]: E0130 06:45:27.028047 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-log"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.028060 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-log"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.028448 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-log"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.028486 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e397ef-0630-40db-a591-28d7584dee76" containerName="glance-httpd"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.029984 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.032916 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.038493 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.057357 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.057414 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.057582 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.057609 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqhxj\" (UniqueName: \"kubernetes.io/projected/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-kube-api-access-sqhxj\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.057629 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-logs\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.057644 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.057692 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.113111 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.159127 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.159465 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqhxj\" (UniqueName: \"kubernetes.io/projected/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-kube-api-access-sqhxj\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.159499 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.159522 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-logs\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.159587 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.159712 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.159749 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.160351 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-logs\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.160461 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.165674 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.168711 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.169079 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.172396 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.177239 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqhxj\" (UniqueName: \"kubernetes.io/projected/ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf-kube-api-access-sqhxj\") pod \"glance-default-internal-api-0\" (UID: \"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf\") " pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.346734 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fd79f5877-c4n2v"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.358281 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.438163 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e397ef-0630-40db-a591-28d7584dee76" path="/var/lib/kubelet/pods/50e397ef-0630-40db-a591-28d7584dee76/volumes"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.439849 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab7585b-916e-4a6a-8aa8-da769aaa437e" path="/var/lib/kubelet/pods/7ab7585b-916e-4a6a-8aa8-da769aaa437e/volumes"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.494412 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.494481 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.922809 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.974771 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf","Type":"ContainerStarted","Data":"920850b60e6a3bf1f1b66ac4ba177b4f4f3a4070aebbad1d428058f2829d34ce"}
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.977171 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94f5b24a-840b-4206-a190-63cd6339ed70","Type":"ContainerStarted","Data":"31ba2be01ceb4b04307a9e18ff07ff463cb8f2d42b16f273c5db1453b725fb17"}
Jan 30 06:45:27 crc kubenswrapper[4931]: I0130 06:45:27.977220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94f5b24a-840b-4206-a190-63cd6339ed70","Type":"ContainerStarted","Data":"c9c30a1dee136eec4233191c93835fcc4ffceca99e62438447aa5776af22fd73"}
Jan 30 06:45:28 crc kubenswrapper[4931]: I0130 06:45:28.084654 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b8f5df775-m6dvd"
Jan 30 06:45:28 crc kubenswrapper[4931]: I0130 06:45:28.084714 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b8f5df775-m6dvd"
Jan 30 06:45:28 crc kubenswrapper[4931]: I0130 06:45:28.994713 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94f5b24a-840b-4206-a190-63cd6339ed70","Type":"ContainerStarted","Data":"b86c394a31d53bfff5771d20ccce9ccb2419535de7783460e0495582b005d673"}
Jan 30 06:45:28 crc kubenswrapper[4931]: I0130 06:45:28.998858 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf","Type":"ContainerStarted","Data":"bca631f700ba78e2d22b5877ca51c64be36cfb8db82d0ffd7a84a8c2aa1dba36"}
Jan 30 06:45:29 crc kubenswrapper[4931]: I0130 06:45:29.023212 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.023182796 podStartE2EDuration="3.023182796s" podCreationTimestamp="2026-01-30 06:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:29.012349253 +0000 UTC m=+5864.382259530" watchObservedRunningTime="2026-01-30 06:45:29.023182796 +0000 UTC m=+5864.393093093"
Jan 30 06:45:29 crc kubenswrapper[4931]: I0130 06:45:29.422334 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:45:29 crc kubenswrapper[4931]: E0130 06:45:29.422929 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:45:30 crc kubenswrapper[4931]: I0130 06:45:30.010498 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf","Type":"ContainerStarted","Data":"f7ec83d39cecd08a7993701455bb9e15718ca83c5aaa26f24f3622dc0ee7ecb9"}
Jan 30 06:45:30 crc kubenswrapper[4931]: I0130 06:45:30.040802 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.040777624 podStartE2EDuration="4.040777624s" podCreationTimestamp="2026-01-30 06:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:30.029733945 +0000 UTC m=+5865.399644222" watchObservedRunningTime="2026-01-30 06:45:30.040777624 +0000 UTC m=+5865.410687901"
Jan 30 06:45:36 crc kubenswrapper[4931]: I0130 06:45:36.469464 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:36 crc kubenswrapper[4931]: I0130 06:45:36.470236 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:36 crc kubenswrapper[4931]: I0130 06:45:36.539254 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:36 crc kubenswrapper[4931]: I0130 06:45:36.562085 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:37 crc kubenswrapper[4931]: I0130 06:45:37.119284 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:37 crc kubenswrapper[4931]: I0130 06:45:37.119360 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:37 crc kubenswrapper[4931]: I0130 06:45:37.359225 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:37 crc kubenswrapper[4931]: I0130 06:45:37.359493 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:37 crc kubenswrapper[4931]: I0130 06:45:37.399257 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:37 crc kubenswrapper[4931]: I0130 06:45:37.445761 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:37 crc kubenswrapper[4931]: I0130 06:45:37.497250 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-86ccfbfc65-5jz59" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused"
Jan 30 06:45:38 crc kubenswrapper[4931]: I0130 06:45:38.087037 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b8f5df775-m6dvd" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.116:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8080: connect: connection refused"
Jan 30 06:45:38 crc kubenswrapper[4931]: I0130 06:45:38.132677 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:38 crc kubenswrapper[4931]: I0130 06:45:38.133043 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:39 crc kubenswrapper[4931]: I0130 06:45:38.999310 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:39 crc kubenswrapper[4931]: I0130 06:45:39.143350 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 06:45:39 crc kubenswrapper[4931]: I0130 06:45:39.403112 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 06:45:40 crc kubenswrapper[4931]: I0130 06:45:40.185908 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:40 crc kubenswrapper[4931]: I0130 06:45:40.186206 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 06:45:40 crc kubenswrapper[4931]: I0130 06:45:40.307664 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 30 06:45:42 crc kubenswrapper[4931]: I0130 06:45:42.424893 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:45:42 crc kubenswrapper[4931]: E0130 06:45:42.427209 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:45:46 crc kubenswrapper[4931]: I0130 06:45:46.064601 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ee04-account-create-update-5mxt8"]
Jan 30 06:45:46 crc kubenswrapper[4931]: I0130 06:45:46.083533 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-w8ln6"]
Jan 30 06:45:46 crc kubenswrapper[4931]: I0130 06:45:46.094148 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ee04-account-create-update-5mxt8"]
Jan 30 06:45:46 crc kubenswrapper[4931]: I0130 06:45:46.103477 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-w8ln6"]
Jan 30 06:45:47 crc kubenswrapper[4931]: I0130 06:45:47.443656 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259088b5-f22c-4773-a526-5ce0d618a3c9" path="/var/lib/kubelet/pods/259088b5-f22c-4773-a526-5ce0d618a3c9/volumes"
Jan 30 06:45:47 crc kubenswrapper[4931]: I0130 06:45:47.446982 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3ee3e2-1067-4d91-8780-4ee1442ddccd" path="/var/lib/kubelet/pods/da3ee3e2-1067-4d91-8780-4ee1442ddccd/volumes"
Jan 30 06:45:49 crc kubenswrapper[4931]: I0130 06:45:49.285208 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:49 crc kubenswrapper[4931]: I0130 06:45:49.852058 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b8f5df775-m6dvd"
Jan 30 06:45:51 crc kubenswrapper[4931]: I0130 06:45:51.061921 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:45:51 crc kubenswrapper[4931]: I0130 06:45:51.569580 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b8f5df775-m6dvd"
Jan 30 06:45:51 crc kubenswrapper[4931]: I0130 06:45:51.643711 4931 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openstack/horizon-86ccfbfc65-5jz59"] Jan 30 06:45:51 crc kubenswrapper[4931]: I0130 06:45:51.644080 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86ccfbfc65-5jz59" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon-log" containerID="cri-o://72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0" gracePeriod=30 Jan 30 06:45:51 crc kubenswrapper[4931]: I0130 06:45:51.644217 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86ccfbfc65-5jz59" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon" containerID="cri-o://700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111" gracePeriod=30 Jan 30 06:45:53 crc kubenswrapper[4931]: I0130 06:45:53.422936 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:45:53 crc kubenswrapper[4931]: E0130 06:45:53.423747 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:45:55 crc kubenswrapper[4931]: I0130 06:45:55.056922 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vxl4f"] Jan 30 06:45:55 crc kubenswrapper[4931]: I0130 06:45:55.075050 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vxl4f"] Jan 30 06:45:55 crc kubenswrapper[4931]: I0130 06:45:55.345311 4931 generic.go:334] "Generic (PLEG): container finished" podID="e180809a-b692-42c0-b821-723afe805954" containerID="700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111" exitCode=0 Jan 30 
06:45:55 crc kubenswrapper[4931]: I0130 06:45:55.345376 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86ccfbfc65-5jz59" event={"ID":"e180809a-b692-42c0-b821-723afe805954","Type":"ContainerDied","Data":"700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111"} Jan 30 06:45:55 crc kubenswrapper[4931]: I0130 06:45:55.449562 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c67196-2e21-4ca1-81c6-ae1d0b68d461" path="/var/lib/kubelet/pods/a2c67196-2e21-4ca1-81c6-ae1d0b68d461/volumes" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.361619 4931 generic.go:334] "Generic (PLEG): container finished" podID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerID="0ac695e770aa1e4dca8f308d0921c4323577d50a3d4f87168b23f7117ef54020" exitCode=137 Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.362252 4931 generic.go:334] "Generic (PLEG): container finished" podID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerID="3f69e23981e22c16e2b3eb960775aef58a246bf4d2a400c78b43e6f979492f62" exitCode=137 Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.362289 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd79f5877-c4n2v" event={"ID":"00dba7a3-7492-4010-9931-1ed387dc22a7","Type":"ContainerDied","Data":"0ac695e770aa1e4dca8f308d0921c4323577d50a3d4f87168b23f7117ef54020"} Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.362328 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd79f5877-c4n2v" event={"ID":"00dba7a3-7492-4010-9931-1ed387dc22a7","Type":"ContainerDied","Data":"3f69e23981e22c16e2b3eb960775aef58a246bf4d2a400c78b43e6f979492f62"} Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.483850 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fd79f5877-c4n2v" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.657349 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00dba7a3-7492-4010-9931-1ed387dc22a7-logs\") pod \"00dba7a3-7492-4010-9931-1ed387dc22a7\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.657403 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00dba7a3-7492-4010-9931-1ed387dc22a7-horizon-secret-key\") pod \"00dba7a3-7492-4010-9931-1ed387dc22a7\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.657461 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-config-data\") pod \"00dba7a3-7492-4010-9931-1ed387dc22a7\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.657622 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5pmn\" (UniqueName: \"kubernetes.io/projected/00dba7a3-7492-4010-9931-1ed387dc22a7-kube-api-access-l5pmn\") pod \"00dba7a3-7492-4010-9931-1ed387dc22a7\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.657706 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-scripts\") pod \"00dba7a3-7492-4010-9931-1ed387dc22a7\" (UID: \"00dba7a3-7492-4010-9931-1ed387dc22a7\") " Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.658583 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/00dba7a3-7492-4010-9931-1ed387dc22a7-logs" (OuterVolumeSpecName: "logs") pod "00dba7a3-7492-4010-9931-1ed387dc22a7" (UID: "00dba7a3-7492-4010-9931-1ed387dc22a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.669629 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dba7a3-7492-4010-9931-1ed387dc22a7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "00dba7a3-7492-4010-9931-1ed387dc22a7" (UID: "00dba7a3-7492-4010-9931-1ed387dc22a7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.671711 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00dba7a3-7492-4010-9931-1ed387dc22a7-kube-api-access-l5pmn" (OuterVolumeSpecName: "kube-api-access-l5pmn") pod "00dba7a3-7492-4010-9931-1ed387dc22a7" (UID: "00dba7a3-7492-4010-9931-1ed387dc22a7"). InnerVolumeSpecName "kube-api-access-l5pmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.687274 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-config-data" (OuterVolumeSpecName: "config-data") pod "00dba7a3-7492-4010-9931-1ed387dc22a7" (UID: "00dba7a3-7492-4010-9931-1ed387dc22a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.706953 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-scripts" (OuterVolumeSpecName: "scripts") pod "00dba7a3-7492-4010-9931-1ed387dc22a7" (UID: "00dba7a3-7492-4010-9931-1ed387dc22a7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.760101 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5pmn\" (UniqueName: \"kubernetes.io/projected/00dba7a3-7492-4010-9931-1ed387dc22a7-kube-api-access-l5pmn\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.760134 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.760143 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00dba7a3-7492-4010-9931-1ed387dc22a7-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.760580 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00dba7a3-7492-4010-9931-1ed387dc22a7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4931]: I0130 06:45:56.760592 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00dba7a3-7492-4010-9931-1ed387dc22a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:57 crc kubenswrapper[4931]: I0130 06:45:57.401276 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd79f5877-c4n2v" event={"ID":"00dba7a3-7492-4010-9931-1ed387dc22a7","Type":"ContainerDied","Data":"b88b4f7f43f890396ef5c3306741e800af49790969d1f953fb512e6e66e5766b"} Jan 30 06:45:57 crc kubenswrapper[4931]: I0130 06:45:57.402181 4931 scope.go:117] "RemoveContainer" containerID="0ac695e770aa1e4dca8f308d0921c4323577d50a3d4f87168b23f7117ef54020" Jan 30 06:45:57 crc kubenswrapper[4931]: I0130 06:45:57.401406 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fd79f5877-c4n2v" Jan 30 06:45:57 crc kubenswrapper[4931]: I0130 06:45:57.478120 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fd79f5877-c4n2v"] Jan 30 06:45:57 crc kubenswrapper[4931]: I0130 06:45:57.492023 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fd79f5877-c4n2v"] Jan 30 06:45:57 crc kubenswrapper[4931]: I0130 06:45:57.494237 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86ccfbfc65-5jz59" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Jan 30 06:45:57 crc kubenswrapper[4931]: I0130 06:45:57.652247 4931 scope.go:117] "RemoveContainer" containerID="3f69e23981e22c16e2b3eb960775aef58a246bf4d2a400c78b43e6f979492f62" Jan 30 06:45:59 crc kubenswrapper[4931]: I0130 06:45:59.444891 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" path="/var/lib/kubelet/pods/00dba7a3-7492-4010-9931-1ed387dc22a7/volumes" Jan 30 06:46:00 crc kubenswrapper[4931]: I0130 06:46:00.829483 4931 scope.go:117] "RemoveContainer" containerID="18c71eb1241272ed04cbfce337c51a3320bfd0991c28ac36edc8dd0665668963" Jan 30 06:46:00 crc kubenswrapper[4931]: I0130 06:46:00.864696 4931 scope.go:117] "RemoveContainer" containerID="d12d2fc2afba982df405694f131f75a0e5433ce67ce82580ab99cf0746dfdbc2" Jan 30 06:46:00 crc kubenswrapper[4931]: I0130 06:46:00.906930 4931 scope.go:117] "RemoveContainer" containerID="2a367b7f63781dff8719e328044d9f7bfe39229339b2c9fd8828dc6b757b0a29" Jan 30 06:46:00 crc kubenswrapper[4931]: I0130 06:46:00.956595 4931 scope.go:117] "RemoveContainer" containerID="8641f9d89c670b316ae569c652c473fa47969340118c8804760552f9529867f0" Jan 30 06:46:00 crc kubenswrapper[4931]: I0130 06:46:00.994355 4931 
scope.go:117] "RemoveContainer" containerID="c1b6dac81e48bbee78ee0997dd94e0a4ebd87490b839272e84bc72df445ca206" Jan 30 06:46:01 crc kubenswrapper[4931]: I0130 06:46:01.039778 4931 scope.go:117] "RemoveContainer" containerID="328e8eda0559ed6f531366255d38e56b8621e4607eb9add0633123842cfdda68" Jan 30 06:46:01 crc kubenswrapper[4931]: I0130 06:46:01.069357 4931 scope.go:117] "RemoveContainer" containerID="a418c4ea4e534161dc6a2d3882bfda776aacf77311c0dc409345881787f7574b" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.129081 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d459c77c7-fncxw"] Jan 30 06:46:05 crc kubenswrapper[4931]: E0130 06:46:05.130272 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.130290 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon" Jan 30 06:46:05 crc kubenswrapper[4931]: E0130 06:46:05.130322 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon-log" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.130333 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon-log" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.130609 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon-log" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.130632 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="00dba7a3-7492-4010-9931-1ed387dc22a7" containerName="horizon" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.131904 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.147519 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d459c77c7-fncxw"] Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.257086 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-horizon-secret-key\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.257161 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpjvc\" (UniqueName: \"kubernetes.io/projected/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-kube-api-access-qpjvc\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.257190 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-scripts\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.257261 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-config-data\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.257327 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-logs\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.359072 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-horizon-secret-key\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.359159 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpjvc\" (UniqueName: \"kubernetes.io/projected/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-kube-api-access-qpjvc\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.359186 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-scripts\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.359235 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-config-data\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.359300 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-logs\") pod 
\"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.359807 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-logs\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.360545 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-scripts\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.361345 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-config-data\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.368037 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-horizon-secret-key\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.381773 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpjvc\" (UniqueName: \"kubernetes.io/projected/4254a5c6-88bc-4b8f-a425-79d9bea9eb6d-kube-api-access-qpjvc\") pod \"horizon-5d459c77c7-fncxw\" (UID: \"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d\") " pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc 
kubenswrapper[4931]: I0130 06:46:05.431098 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:46:05 crc kubenswrapper[4931]: E0130 06:46:05.431534 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.468718 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:05 crc kubenswrapper[4931]: I0130 06:46:05.963946 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d459c77c7-fncxw"] Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.513614 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d459c77c7-fncxw" event={"ID":"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d","Type":"ContainerStarted","Data":"dae790fec84756ec577c5580230a30addbc905f61e2ddb53534192cbef4e266b"} Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.513939 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d459c77c7-fncxw" event={"ID":"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d","Type":"ContainerStarted","Data":"cec5e276d2533b840d8931e0ff453f085ac7186820b7dbf22e9c5b98a875574e"} Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.513955 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d459c77c7-fncxw" event={"ID":"4254a5c6-88bc-4b8f-a425-79d9bea9eb6d","Type":"ContainerStarted","Data":"e5ef97d6cee0490ed3dc2c6a36021e37dbe2a20b3c7fd83e92391cdd18ddab81"} Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.538129 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d459c77c7-fncxw" podStartSLOduration=1.538103813 podStartE2EDuration="1.538103813s" podCreationTimestamp="2026-01-30 06:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:46:06.53228085 +0000 UTC m=+5901.902191107" watchObservedRunningTime="2026-01-30 06:46:06.538103813 +0000 UTC m=+5901.908014080" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.672273 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-md7t7"] Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.674008 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-md7t7" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.680058 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-md7t7"] Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.794320 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97hj\" (UniqueName: \"kubernetes.io/projected/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-kube-api-access-t97hj\") pod \"heat-db-create-md7t7\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") " pod="openstack/heat-db-create-md7t7" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.794710 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-operator-scripts\") pod \"heat-db-create-md7t7\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") " pod="openstack/heat-db-create-md7t7" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.876257 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-0207-account-create-update-nwwgb"] Jan 30 06:46:06 crc kubenswrapper[4931]: 
I0130 06:46:06.881467 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-0207-account-create-update-nwwgb" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.884934 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.895646 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-0207-account-create-update-nwwgb"] Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.896487 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97hj\" (UniqueName: \"kubernetes.io/projected/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-kube-api-access-t97hj\") pod \"heat-db-create-md7t7\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") " pod="openstack/heat-db-create-md7t7" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.896550 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-operator-scripts\") pod \"heat-db-create-md7t7\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") " pod="openstack/heat-db-create-md7t7" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.897459 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-operator-scripts\") pod \"heat-db-create-md7t7\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") " pod="openstack/heat-db-create-md7t7" Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.918977 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97hj\" (UniqueName: \"kubernetes.io/projected/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-kube-api-access-t97hj\") pod \"heat-db-create-md7t7\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") " pod="openstack/heat-db-create-md7t7" Jan 
Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.998778 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbldw\" (UniqueName: \"kubernetes.io/projected/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-kube-api-access-wbldw\") pod \"heat-0207-account-create-update-nwwgb\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") " pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:06 crc kubenswrapper[4931]: I0130 06:46:06.999122 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-operator-scripts\") pod \"heat-0207-account-create-update-nwwgb\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") " pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.028472 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-md7t7"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.101410 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbldw\" (UniqueName: \"kubernetes.io/projected/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-kube-api-access-wbldw\") pod \"heat-0207-account-create-update-nwwgb\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") " pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.101575 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-operator-scripts\") pod \"heat-0207-account-create-update-nwwgb\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") " pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.102560 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-operator-scripts\") pod \"heat-0207-account-create-update-nwwgb\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") " pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.123525 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbldw\" (UniqueName: \"kubernetes.io/projected/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-kube-api-access-wbldw\") pod \"heat-0207-account-create-update-nwwgb\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") " pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.202486 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.494490 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86ccfbfc65-5jz59" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused"
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.538495 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-md7t7"]
Jan 30 06:46:07 crc kubenswrapper[4931]: W0130 06:46:07.541620 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b44b5a_7476_44a4_b7ca_e6c246e9afdc.slice/crio-3d55ce172c678f76fc2348dc3f38851f5e702decefa7e63fc458f4461ec77231 WatchSource:0}: Error finding container 3d55ce172c678f76fc2348dc3f38851f5e702decefa7e63fc458f4461ec77231: Status 404 returned error can't find the container with id 3d55ce172c678f76fc2348dc3f38851f5e702decefa7e63fc458f4461ec77231
Jan 30 06:46:07 crc kubenswrapper[4931]: I0130 06:46:07.699404 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-0207-account-create-update-nwwgb"]
Jan 30 06:46:08 crc kubenswrapper[4931]: I0130 06:46:08.535675 4931 generic.go:334] "Generic (PLEG): container finished" podID="4110f6ea-5daa-4a1f-8fc2-f9497b7024f7" containerID="0c6e2269ccd94b91b1bc61c0d6038a0f312e1ff979dd42b9e25c99edd027ce3a" exitCode=0
Jan 30 06:46:08 crc kubenswrapper[4931]: I0130 06:46:08.536384 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-0207-account-create-update-nwwgb" event={"ID":"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7","Type":"ContainerDied","Data":"0c6e2269ccd94b91b1bc61c0d6038a0f312e1ff979dd42b9e25c99edd027ce3a"}
Jan 30 06:46:08 crc kubenswrapper[4931]: I0130 06:46:08.536442 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-0207-account-create-update-nwwgb" event={"ID":"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7","Type":"ContainerStarted","Data":"7759660487bc155207cd5eb26c0d9fbddda7ab8c7b320ed171716e13152c14b8"}
Jan 30 06:46:08 crc kubenswrapper[4931]: I0130 06:46:08.540162 4931 generic.go:334] "Generic (PLEG): container finished" podID="65b44b5a-7476-44a4-b7ca-e6c246e9afdc" containerID="461da2cabb65077a09c290e33233aae28ff5843458cdbe68b5fe17f6c78dd05f" exitCode=0
Jan 30 06:46:08 crc kubenswrapper[4931]: I0130 06:46:08.540208 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-md7t7" event={"ID":"65b44b5a-7476-44a4-b7ca-e6c246e9afdc","Type":"ContainerDied","Data":"461da2cabb65077a09c290e33233aae28ff5843458cdbe68b5fe17f6c78dd05f"}
Jan 30 06:46:08 crc kubenswrapper[4931]: I0130 06:46:08.540238 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-md7t7" event={"ID":"65b44b5a-7476-44a4-b7ca-e6c246e9afdc","Type":"ContainerStarted","Data":"3d55ce172c678f76fc2348dc3f38851f5e702decefa7e63fc458f4461ec77231"}
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.071530 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.099955 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-md7t7"
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.175165 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-operator-scripts\") pod \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") "
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.175262 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t97hj\" (UniqueName: \"kubernetes.io/projected/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-kube-api-access-t97hj\") pod \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\" (UID: \"65b44b5a-7476-44a4-b7ca-e6c246e9afdc\") "
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.175349 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbldw\" (UniqueName: \"kubernetes.io/projected/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-kube-api-access-wbldw\") pod \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") "
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.175394 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-operator-scripts\") pod \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\" (UID: \"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7\") "
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.176384 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65b44b5a-7476-44a4-b7ca-e6c246e9afdc" (UID: "65b44b5a-7476-44a4-b7ca-e6c246e9afdc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.176413 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4110f6ea-5daa-4a1f-8fc2-f9497b7024f7" (UID: "4110f6ea-5daa-4a1f-8fc2-f9497b7024f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.183016 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-kube-api-access-wbldw" (OuterVolumeSpecName: "kube-api-access-wbldw") pod "4110f6ea-5daa-4a1f-8fc2-f9497b7024f7" (UID: "4110f6ea-5daa-4a1f-8fc2-f9497b7024f7"). InnerVolumeSpecName "kube-api-access-wbldw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.183604 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-kube-api-access-t97hj" (OuterVolumeSpecName: "kube-api-access-t97hj") pod "65b44b5a-7476-44a4-b7ca-e6c246e9afdc" (UID: "65b44b5a-7476-44a4-b7ca-e6c246e9afdc"). InnerVolumeSpecName "kube-api-access-t97hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.277857 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.277900 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t97hj\" (UniqueName: \"kubernetes.io/projected/65b44b5a-7476-44a4-b7ca-e6c246e9afdc-kube-api-access-t97hj\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.277915 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbldw\" (UniqueName: \"kubernetes.io/projected/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-kube-api-access-wbldw\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.277926 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.562847 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-0207-account-create-update-nwwgb"
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.562868 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-0207-account-create-update-nwwgb" event={"ID":"4110f6ea-5daa-4a1f-8fc2-f9497b7024f7","Type":"ContainerDied","Data":"7759660487bc155207cd5eb26c0d9fbddda7ab8c7b320ed171716e13152c14b8"}
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.562911 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7759660487bc155207cd5eb26c0d9fbddda7ab8c7b320ed171716e13152c14b8"
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.564542 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-md7t7" event={"ID":"65b44b5a-7476-44a4-b7ca-e6c246e9afdc","Type":"ContainerDied","Data":"3d55ce172c678f76fc2348dc3f38851f5e702decefa7e63fc458f4461ec77231"}
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.564569 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d55ce172c678f76fc2348dc3f38851f5e702decefa7e63fc458f4461ec77231"
Jan 30 06:46:10 crc kubenswrapper[4931]: I0130 06:46:10.564820 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-md7t7"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.035640 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-t75hv"]
Jan 30 06:46:12 crc kubenswrapper[4931]: E0130 06:46:12.036484 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4110f6ea-5daa-4a1f-8fc2-f9497b7024f7" containerName="mariadb-account-create-update"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.036504 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4110f6ea-5daa-4a1f-8fc2-f9497b7024f7" containerName="mariadb-account-create-update"
Jan 30 06:46:12 crc kubenswrapper[4931]: E0130 06:46:12.036525 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b44b5a-7476-44a4-b7ca-e6c246e9afdc" containerName="mariadb-database-create"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.036557 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b44b5a-7476-44a4-b7ca-e6c246e9afdc" containerName="mariadb-database-create"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.036927 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4110f6ea-5daa-4a1f-8fc2-f9497b7024f7" containerName="mariadb-account-create-update"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.036965 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b44b5a-7476-44a4-b7ca-e6c246e9afdc" containerName="mariadb-database-create"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.037956 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.044179 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.044401 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-rpnp9"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.054825 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-t75hv"]
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.120260 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rzw\" (UniqueName: \"kubernetes.io/projected/eae9c157-1120-45ac-8d6c-cc417f364b1f-kube-api-access-p5rzw\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.120852 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-combined-ca-bundle\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.120895 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-config-data\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.228259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rzw\" (UniqueName: \"kubernetes.io/projected/eae9c157-1120-45ac-8d6c-cc417f364b1f-kube-api-access-p5rzw\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.228345 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-combined-ca-bundle\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.228381 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-config-data\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.234926 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-config-data\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.237689 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-combined-ca-bundle\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.247601 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5rzw\" (UniqueName: \"kubernetes.io/projected/eae9c157-1120-45ac-8d6c-cc417f364b1f-kube-api-access-p5rzw\") pod \"heat-db-sync-t75hv\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") " pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.379227 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:12 crc kubenswrapper[4931]: I0130 06:46:12.876597 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-t75hv"]
Jan 30 06:46:13 crc kubenswrapper[4931]: I0130 06:46:13.608244 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-t75hv" event={"ID":"eae9c157-1120-45ac-8d6c-cc417f364b1f","Type":"ContainerStarted","Data":"ec88771f0c85057efcfc2cbe78e40d1ed716715f7c536bdc884c1a61abdb4693"}
Jan 30 06:46:15 crc kubenswrapper[4931]: I0130 06:46:15.469869 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d459c77c7-fncxw"
Jan 30 06:46:15 crc kubenswrapper[4931]: I0130 06:46:15.470217 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d459c77c7-fncxw"
Jan 30 06:46:16 crc kubenswrapper[4931]: I0130 06:46:16.422286 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7"
Jan 30 06:46:16 crc kubenswrapper[4931]: E0130 06:46:16.423065 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:46:17 crc kubenswrapper[4931]: I0130 06:46:17.493788 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86ccfbfc65-5jz59" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused"
Jan 30 06:46:17 crc kubenswrapper[4931]: I0130 06:46:17.494217 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:46:19 crc kubenswrapper[4931]: I0130 06:46:19.688065 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-t75hv" event={"ID":"eae9c157-1120-45ac-8d6c-cc417f364b1f","Type":"ContainerStarted","Data":"21f9a5808ddea5a133a2d53b441ae213a916b33040640a7576f4d7163df3f19d"}
Jan 30 06:46:19 crc kubenswrapper[4931]: I0130 06:46:19.719765 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-t75hv" podStartSLOduration=1.5889306159999999 podStartE2EDuration="7.719745123s" podCreationTimestamp="2026-01-30 06:46:12 +0000 UTC" firstStartedPulling="2026-01-30 06:46:12.880099797 +0000 UTC m=+5908.250010094" lastFinishedPulling="2026-01-30 06:46:19.010914304 +0000 UTC m=+5914.380824601" observedRunningTime="2026-01-30 06:46:19.712946783 +0000 UTC m=+5915.082857070" watchObservedRunningTime="2026-01-30 06:46:19.719745123 +0000 UTC m=+5915.089655400"
Jan 30 06:46:21 crc kubenswrapper[4931]: I0130 06:46:21.713179 4931 generic.go:334] "Generic (PLEG): container finished" podID="eae9c157-1120-45ac-8d6c-cc417f364b1f" containerID="21f9a5808ddea5a133a2d53b441ae213a916b33040640a7576f4d7163df3f19d" exitCode=0
Jan 30 06:46:21 crc kubenswrapper[4931]: I0130 06:46:21.713221 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-t75hv" event={"ID":"eae9c157-1120-45ac-8d6c-cc417f364b1f","Type":"ContainerDied","Data":"21f9a5808ddea5a133a2d53b441ae213a916b33040640a7576f4d7163df3f19d"}
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.027488 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.046483 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e180809a-b692-42c0-b821-723afe805954-horizon-secret-key\") pod \"e180809a-b692-42c0-b821-723afe805954\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") "
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.046957 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-scripts\") pod \"e180809a-b692-42c0-b821-723afe805954\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") "
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.047067 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-config-data\") pod \"e180809a-b692-42c0-b821-723afe805954\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") "
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.047136 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e180809a-b692-42c0-b821-723afe805954-logs\") pod \"e180809a-b692-42c0-b821-723afe805954\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") "
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.047181 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hdtv\" (UniqueName: \"kubernetes.io/projected/e180809a-b692-42c0-b821-723afe805954-kube-api-access-7hdtv\") pod \"e180809a-b692-42c0-b821-723afe805954\" (UID: \"e180809a-b692-42c0-b821-723afe805954\") "
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.051531 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e180809a-b692-42c0-b821-723afe805954-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e180809a-b692-42c0-b821-723afe805954" (UID: "e180809a-b692-42c0-b821-723afe805954"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.051905 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e180809a-b692-42c0-b821-723afe805954-logs" (OuterVolumeSpecName: "logs") pod "e180809a-b692-42c0-b821-723afe805954" (UID: "e180809a-b692-42c0-b821-723afe805954"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.056218 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e180809a-b692-42c0-b821-723afe805954-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.056587 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e180809a-b692-42c0-b821-723afe805954-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.109195 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e180809a-b692-42c0-b821-723afe805954-kube-api-access-7hdtv" (OuterVolumeSpecName: "kube-api-access-7hdtv") pod "e180809a-b692-42c0-b821-723afe805954" (UID: "e180809a-b692-42c0-b821-723afe805954"). InnerVolumeSpecName "kube-api-access-7hdtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.110160 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-scripts" (OuterVolumeSpecName: "scripts") pod "e180809a-b692-42c0-b821-723afe805954" (UID: "e180809a-b692-42c0-b821-723afe805954"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.112487 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-config-data" (OuterVolumeSpecName: "config-data") pod "e180809a-b692-42c0-b821-723afe805954" (UID: "e180809a-b692-42c0-b821-723afe805954"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.158468 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.158520 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hdtv\" (UniqueName: \"kubernetes.io/projected/e180809a-b692-42c0-b821-723afe805954-kube-api-access-7hdtv\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.158539 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e180809a-b692-42c0-b821-723afe805954-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.733946 4931 generic.go:334] "Generic (PLEG): container finished" podID="e180809a-b692-42c0-b821-723afe805954" containerID="72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0" exitCode=137
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.734016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86ccfbfc65-5jz59" event={"ID":"e180809a-b692-42c0-b821-723afe805954","Type":"ContainerDied","Data":"72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0"}
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.734050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86ccfbfc65-5jz59" event={"ID":"e180809a-b692-42c0-b821-723afe805954","Type":"ContainerDied","Data":"6e7fd796d4cb0d311b3d0a188a2af32435b8c86d4ea09b091cca87d30bbf2b5d"}
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.734069 4931 scope.go:117] "RemoveContainer" containerID="700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111"
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.733993 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86ccfbfc65-5jz59"
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.807976 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86ccfbfc65-5jz59"]
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.817905 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86ccfbfc65-5jz59"]
Jan 30 06:46:22 crc kubenswrapper[4931]: I0130 06:46:22.985032 4931 scope.go:117] "RemoveContainer" containerID="72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.010962 4931 scope.go:117] "RemoveContainer" containerID="700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111"
Jan 30 06:46:23 crc kubenswrapper[4931]: E0130 06:46:23.011702 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111\": container with ID starting with 700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111 not found: ID does not exist" containerID="700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.011733 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111"} err="failed to get container status \"700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111\": rpc error: code = NotFound desc = could not find container \"700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111\": container with ID starting with 700676473cd61ebacb9e15a0a971ab42c8f08fbaa84f601ba058ff31fbcd8111 not found: ID does not exist"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.011782 4931 scope.go:117] "RemoveContainer" containerID="72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0"
Jan 30 06:46:23 crc kubenswrapper[4931]: E0130 06:46:23.012199 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0\": container with ID starting with 72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0 not found: ID does not exist" containerID="72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.012231 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0"} err="failed to get container status \"72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0\": rpc error: code = NotFound desc = could not find container \"72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0\": container with ID starting with 72e47d4ade77f5ded5f7a763729bf9cc9f93500e98e8c475f9d41f7b996918d0 not found: ID does not exist"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.107267 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.287469 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-combined-ca-bundle\") pod \"eae9c157-1120-45ac-8d6c-cc417f364b1f\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") "
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.287564 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5rzw\" (UniqueName: \"kubernetes.io/projected/eae9c157-1120-45ac-8d6c-cc417f364b1f-kube-api-access-p5rzw\") pod \"eae9c157-1120-45ac-8d6c-cc417f364b1f\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") "
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.287776 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-config-data\") pod \"eae9c157-1120-45ac-8d6c-cc417f364b1f\" (UID: \"eae9c157-1120-45ac-8d6c-cc417f364b1f\") "
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.294875 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae9c157-1120-45ac-8d6c-cc417f364b1f-kube-api-access-p5rzw" (OuterVolumeSpecName: "kube-api-access-p5rzw") pod "eae9c157-1120-45ac-8d6c-cc417f364b1f" (UID: "eae9c157-1120-45ac-8d6c-cc417f364b1f"). InnerVolumeSpecName "kube-api-access-p5rzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.345389 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eae9c157-1120-45ac-8d6c-cc417f364b1f" (UID: "eae9c157-1120-45ac-8d6c-cc417f364b1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.391652 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5rzw\" (UniqueName: \"kubernetes.io/projected/eae9c157-1120-45ac-8d6c-cc417f364b1f-kube-api-access-p5rzw\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.392106 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.409155 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-config-data" (OuterVolumeSpecName: "config-data") pod "eae9c157-1120-45ac-8d6c-cc417f364b1f" (UID: "eae9c157-1120-45ac-8d6c-cc417f364b1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.440780 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e180809a-b692-42c0-b821-723afe805954" path="/var/lib/kubelet/pods/e180809a-b692-42c0-b821-723afe805954/volumes"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.494077 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae9c157-1120-45ac-8d6c-cc417f364b1f-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.745286 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-t75hv" event={"ID":"eae9c157-1120-45ac-8d6c-cc417f364b1f","Type":"ContainerDied","Data":"ec88771f0c85057efcfc2cbe78e40d1ed716715f7c536bdc884c1a61abdb4693"}
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.745329 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec88771f0c85057efcfc2cbe78e40d1ed716715f7c536bdc884c1a61abdb4693"
Jan 30 06:46:23 crc kubenswrapper[4931]: I0130 06:46:23.745386 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-t75hv"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.025592 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6d6f44c564-6wts7"]
Jan 30 06:46:25 crc kubenswrapper[4931]: E0130 06:46:25.026550 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.026588 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon"
Jan 30 06:46:25 crc kubenswrapper[4931]: E0130 06:46:25.026620 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae9c157-1120-45ac-8d6c-cc417f364b1f" containerName="heat-db-sync"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.026631 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae9c157-1120-45ac-8d6c-cc417f364b1f" containerName="heat-db-sync"
Jan 30 06:46:25 crc kubenswrapper[4931]: E0130 06:46:25.026652 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon-log"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.026664 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon-log"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.026988 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.027019 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae9c157-1120-45ac-8d6c-cc417f364b1f" containerName="heat-db-sync"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.027054 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e180809a-b692-42c0-b821-723afe805954" containerName="horizon-log"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.028138 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.032291 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.032760 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.044392 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-rpnp9"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.064847 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6d6f44c564-6wts7"]
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.128857 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-config-data-custom\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.128945 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrq8\" (UniqueName: \"kubernetes.io/projected/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-kube-api-access-sbrq8\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.128976 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-config-data\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.129073 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-combined-ca-bundle\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.231533 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-config-data\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.231676 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-combined-ca-bundle\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.231717 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-config-data-custom\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7"
Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.231794 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrq8\" (UniqueName: \"kubernetes.io/projected/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-kube-api-access-sbrq8\") pod \"heat-engine-6d6f44c564-6wts7\" (UID:
\"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.240559 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-config-data\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.245255 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-combined-ca-bundle\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.249972 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-config-data-custom\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.251170 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrq8\" (UniqueName: \"kubernetes.io/projected/78cdbc3b-0ff9-4204-b62e-bc784e3fcb87-kube-api-access-sbrq8\") pod \"heat-engine-6d6f44c564-6wts7\" (UID: \"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87\") " pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.321121 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-795f886c68-gphf9"] Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.322358 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.329329 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.337085 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7944f98bdf-sfnzs"] Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.338245 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.341718 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.365498 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.376076 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7944f98bdf-sfnzs"] Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.385218 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-795f886c68-gphf9"] Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438081 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-config-data\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438135 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-config-data-custom\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: 
\"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438155 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td62v\" (UniqueName: \"kubernetes.io/projected/7094dd36-79d9-4c63-9441-1753815af4a7-kube-api-access-td62v\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438304 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5gmh\" (UniqueName: \"kubernetes.io/projected/e3a9064f-a3e2-4734-8b77-9e42deff080a-kube-api-access-p5gmh\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438388 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-config-data\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438410 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-combined-ca-bundle\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438527 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-config-data-custom\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.438613 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-combined-ca-bundle\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.541895 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-config-data\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.542011 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-combined-ca-bundle\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.542262 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-config-data-custom\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.542388 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-combined-ca-bundle\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.542599 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-config-data\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.542677 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-config-data-custom\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.542719 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td62v\" (UniqueName: \"kubernetes.io/projected/7094dd36-79d9-4c63-9441-1753815af4a7-kube-api-access-td62v\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.542802 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5gmh\" (UniqueName: \"kubernetes.io/projected/e3a9064f-a3e2-4734-8b77-9e42deff080a-kube-api-access-p5gmh\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.555560 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-config-data\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.563881 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-config-data-custom\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.569351 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-config-data\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.570279 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-config-data-custom\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.574608 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7094dd36-79d9-4c63-9441-1753815af4a7-combined-ca-bundle\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.581225 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5gmh\" (UniqueName: \"kubernetes.io/projected/e3a9064f-a3e2-4734-8b77-9e42deff080a-kube-api-access-p5gmh\") pod 
\"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.583868 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a9064f-a3e2-4734-8b77-9e42deff080a-combined-ca-bundle\") pod \"heat-api-795f886c68-gphf9\" (UID: \"e3a9064f-a3e2-4734-8b77-9e42deff080a\") " pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.585143 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td62v\" (UniqueName: \"kubernetes.io/projected/7094dd36-79d9-4c63-9441-1753815af4a7-kube-api-access-td62v\") pod \"heat-cfnapi-7944f98bdf-sfnzs\" (UID: \"7094dd36-79d9-4c63-9441-1753815af4a7\") " pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.655870 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.703809 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:25 crc kubenswrapper[4931]: I0130 06:46:25.921051 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6d6f44c564-6wts7"] Jan 30 06:46:25 crc kubenswrapper[4931]: W0130 06:46:25.926524 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78cdbc3b_0ff9_4204_b62e_bc784e3fcb87.slice/crio-f834194230f523ed8c30e4b0b48089f146097cc3b4f36bfbae6c0dcffaecca47 WatchSource:0}: Error finding container f834194230f523ed8c30e4b0b48089f146097cc3b4f36bfbae6c0dcffaecca47: Status 404 returned error can't find the container with id f834194230f523ed8c30e4b0b48089f146097cc3b4f36bfbae6c0dcffaecca47 Jan 30 06:46:26 crc kubenswrapper[4931]: I0130 06:46:26.110730 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-795f886c68-gphf9"] Jan 30 06:46:26 crc kubenswrapper[4931]: I0130 06:46:26.250193 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7944f98bdf-sfnzs"] Jan 30 06:46:26 crc kubenswrapper[4931]: W0130 06:46:26.251221 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7094dd36_79d9_4c63_9441_1753815af4a7.slice/crio-3b56b4340ea6265de44572bd720f7b92a4cfe2b0d6f5d3e3e70e9928348f9997 WatchSource:0}: Error finding container 3b56b4340ea6265de44572bd720f7b92a4cfe2b0d6f5d3e3e70e9928348f9997: Status 404 returned error can't find the container with id 3b56b4340ea6265de44572bd720f7b92a4cfe2b0d6f5d3e3e70e9928348f9997 Jan 30 06:46:26 crc kubenswrapper[4931]: I0130 06:46:26.819696 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" event={"ID":"7094dd36-79d9-4c63-9441-1753815af4a7","Type":"ContainerStarted","Data":"3b56b4340ea6265de44572bd720f7b92a4cfe2b0d6f5d3e3e70e9928348f9997"} Jan 30 06:46:26 crc kubenswrapper[4931]: 
I0130 06:46:26.822958 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6d6f44c564-6wts7" event={"ID":"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87","Type":"ContainerStarted","Data":"c10d223cc51544d9fecfc8f7ac0fce0fb40f94528d9ce0fb8a1ef18e35c5cb6d"} Jan 30 06:46:26 crc kubenswrapper[4931]: I0130 06:46:26.823016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6d6f44c564-6wts7" event={"ID":"78cdbc3b-0ff9-4204-b62e-bc784e3fcb87","Type":"ContainerStarted","Data":"f834194230f523ed8c30e4b0b48089f146097cc3b4f36bfbae6c0dcffaecca47"} Jan 30 06:46:26 crc kubenswrapper[4931]: I0130 06:46:26.823265 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:26 crc kubenswrapper[4931]: I0130 06:46:26.825083 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-795f886c68-gphf9" event={"ID":"e3a9064f-a3e2-4734-8b77-9e42deff080a","Type":"ContainerStarted","Data":"cdb4f4fff3ef66eb047fc0fad5ba4ec59c5df1bd897863a7040dd254520fe016"} Jan 30 06:46:26 crc kubenswrapper[4931]: I0130 06:46:26.840071 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6d6f44c564-6wts7" podStartSLOduration=2.8400480310000003 podStartE2EDuration="2.840048031s" podCreationTimestamp="2026-01-30 06:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:46:26.836953684 +0000 UTC m=+5922.206863951" watchObservedRunningTime="2026-01-30 06:46:26.840048031 +0000 UTC m=+5922.209958288" Jan 30 06:46:27 crc kubenswrapper[4931]: I0130 06:46:27.383724 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.103812 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-5d459c77c7-fncxw" Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.162401 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b8f5df775-m6dvd"] Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.162753 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b8f5df775-m6dvd" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon-log" containerID="cri-o://16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb" gracePeriod=30 Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.162895 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b8f5df775-m6dvd" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon" containerID="cri-o://85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88" gracePeriod=30 Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.853230 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" event={"ID":"7094dd36-79d9-4c63-9441-1753815af4a7","Type":"ContainerStarted","Data":"a8b2f0d8514937b1d4f06d703fc47131681d783d24f8dfbebd8154ecbfd67779"} Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.853889 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.855583 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-795f886c68-gphf9" event={"ID":"e3a9064f-a3e2-4734-8b77-9e42deff080a","Type":"ContainerStarted","Data":"61ced90ca96268bddf1abf323519f365962314a7c94828080c8fa1faf48a1a8a"} Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.856111 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.879078 4931 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" podStartSLOduration=2.221850973 podStartE2EDuration="4.879058611s" podCreationTimestamp="2026-01-30 06:46:25 +0000 UTC" firstStartedPulling="2026-01-30 06:46:26.253880923 +0000 UTC m=+5921.623791180" lastFinishedPulling="2026-01-30 06:46:28.911088561 +0000 UTC m=+5924.280998818" observedRunningTime="2026-01-30 06:46:29.869291598 +0000 UTC m=+5925.239201855" watchObservedRunningTime="2026-01-30 06:46:29.879058611 +0000 UTC m=+5925.248968878" Jan 30 06:46:29 crc kubenswrapper[4931]: I0130 06:46:29.896605 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-795f886c68-gphf9" podStartSLOduration=2.108495972 podStartE2EDuration="4.896580131s" podCreationTimestamp="2026-01-30 06:46:25 +0000 UTC" firstStartedPulling="2026-01-30 06:46:26.121729026 +0000 UTC m=+5921.491639283" lastFinishedPulling="2026-01-30 06:46:28.909813185 +0000 UTC m=+5924.279723442" observedRunningTime="2026-01-30 06:46:29.89009039 +0000 UTC m=+5925.260000657" watchObservedRunningTime="2026-01-30 06:46:29.896580131 +0000 UTC m=+5925.266490388" Jan 30 06:46:31 crc kubenswrapper[4931]: I0130 06:46:31.422304 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:46:31 crc kubenswrapper[4931]: E0130 06:46:31.422759 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:46:32 crc kubenswrapper[4931]: I0130 06:46:32.893387 4931 generic.go:334] "Generic (PLEG): container finished" podID="1d524b32-d060-41f3-88a6-d5339c438fff" 
containerID="85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88" exitCode=0 Jan 30 06:46:32 crc kubenswrapper[4931]: I0130 06:46:32.893757 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f5df775-m6dvd" event={"ID":"1d524b32-d060-41f3-88a6-d5339c438fff","Type":"ContainerDied","Data":"85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88"} Jan 30 06:46:36 crc kubenswrapper[4931]: I0130 06:46:36.897818 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-795f886c68-gphf9" Jan 30 06:46:37 crc kubenswrapper[4931]: I0130 06:46:37.050721 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8be9-account-create-update-4qptt"] Jan 30 06:46:37 crc kubenswrapper[4931]: I0130 06:46:37.060499 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-fcgh6"] Jan 30 06:46:37 crc kubenswrapper[4931]: I0130 06:46:37.073522 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-fcgh6"] Jan 30 06:46:37 crc kubenswrapper[4931]: I0130 06:46:37.083295 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7944f98bdf-sfnzs" Jan 30 06:46:37 crc kubenswrapper[4931]: I0130 06:46:37.085794 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8be9-account-create-update-4qptt"] Jan 30 06:46:37 crc kubenswrapper[4931]: I0130 06:46:37.436182 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cc38ea-1412-4e17-9c74-779b7c6d701c" path="/var/lib/kubelet/pods/f6cc38ea-1412-4e17-9c74-779b7c6d701c/volumes" Jan 30 06:46:37 crc kubenswrapper[4931]: I0130 06:46:37.436826 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5a82c2-728c-40a6-83b0-37ba70d84931" path="/var/lib/kubelet/pods/fe5a82c2-728c-40a6-83b0-37ba70d84931/volumes" Jan 30 06:46:38 crc kubenswrapper[4931]: I0130 06:46:38.084956 4931 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b8f5df775-m6dvd" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.116:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8080: connect: connection refused" Jan 30 06:46:43 crc kubenswrapper[4931]: I0130 06:46:43.423622 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:46:43 crc kubenswrapper[4931]: E0130 06:46:43.426347 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:46:44 crc kubenswrapper[4931]: I0130 06:46:44.047140 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-529p5"] Jan 30 06:46:44 crc kubenswrapper[4931]: I0130 06:46:44.059626 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-529p5"] Jan 30 06:46:45 crc kubenswrapper[4931]: I0130 06:46:45.408750 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6d6f44c564-6wts7" Jan 30 06:46:45 crc kubenswrapper[4931]: I0130 06:46:45.456204 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff91271-f1e2-4aaf-adec-bc61ce9dedad" path="/var/lib/kubelet/pods/bff91271-f1e2-4aaf-adec-bc61ce9dedad/volumes" Jan 30 06:46:48 crc kubenswrapper[4931]: I0130 06:46:48.085366 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b8f5df775-m6dvd" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon" probeResult="failure" 
output="Get \"http://10.217.1.116:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8080: connect: connection refused" Jan 30 06:46:54 crc kubenswrapper[4931]: I0130 06:46:54.422031 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:46:54 crc kubenswrapper[4931]: E0130 06:46:54.422972 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:46:58 crc kubenswrapper[4931]: I0130 06:46:58.085074 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b8f5df775-m6dvd" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.116:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8080: connect: connection refused" Jan 30 06:46:58 crc kubenswrapper[4931]: I0130 06:46:58.085703 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.636410 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.747371 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-config-data\") pod \"1d524b32-d060-41f3-88a6-d5339c438fff\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.747451 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-scripts\") pod \"1d524b32-d060-41f3-88a6-d5339c438fff\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.747545 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d524b32-d060-41f3-88a6-d5339c438fff-logs\") pod \"1d524b32-d060-41f3-88a6-d5339c438fff\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.747643 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-477f8\" (UniqueName: \"kubernetes.io/projected/1d524b32-d060-41f3-88a6-d5339c438fff-kube-api-access-477f8\") pod \"1d524b32-d060-41f3-88a6-d5339c438fff\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.747701 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d524b32-d060-41f3-88a6-d5339c438fff-horizon-secret-key\") pod \"1d524b32-d060-41f3-88a6-d5339c438fff\" (UID: \"1d524b32-d060-41f3-88a6-d5339c438fff\") " Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.748055 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1d524b32-d060-41f3-88a6-d5339c438fff-logs" (OuterVolumeSpecName: "logs") pod "1d524b32-d060-41f3-88a6-d5339c438fff" (UID: "1d524b32-d060-41f3-88a6-d5339c438fff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.754139 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d524b32-d060-41f3-88a6-d5339c438fff-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1d524b32-d060-41f3-88a6-d5339c438fff" (UID: "1d524b32-d060-41f3-88a6-d5339c438fff"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.754694 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d524b32-d060-41f3-88a6-d5339c438fff-kube-api-access-477f8" (OuterVolumeSpecName: "kube-api-access-477f8") pod "1d524b32-d060-41f3-88a6-d5339c438fff" (UID: "1d524b32-d060-41f3-88a6-d5339c438fff"). InnerVolumeSpecName "kube-api-access-477f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.776521 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-config-data" (OuterVolumeSpecName: "config-data") pod "1d524b32-d060-41f3-88a6-d5339c438fff" (UID: "1d524b32-d060-41f3-88a6-d5339c438fff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.776568 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-scripts" (OuterVolumeSpecName: "scripts") pod "1d524b32-d060-41f3-88a6-d5339c438fff" (UID: "1d524b32-d060-41f3-88a6-d5339c438fff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.850018 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.850051 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d524b32-d060-41f3-88a6-d5339c438fff-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.850061 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d524b32-d060-41f3-88a6-d5339c438fff-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.850071 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-477f8\" (UniqueName: \"kubernetes.io/projected/1d524b32-d060-41f3-88a6-d5339c438fff-kube-api-access-477f8\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:59 crc kubenswrapper[4931]: I0130 06:46:59.850083 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d524b32-d060-41f3-88a6-d5339c438fff-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.200240 4931 generic.go:334] "Generic (PLEG): container finished" podID="1d524b32-d060-41f3-88a6-d5339c438fff" containerID="16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb" exitCode=137 Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.200276 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b8f5df775-m6dvd" Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.200293 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f5df775-m6dvd" event={"ID":"1d524b32-d060-41f3-88a6-d5339c438fff","Type":"ContainerDied","Data":"16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb"} Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.200868 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f5df775-m6dvd" event={"ID":"1d524b32-d060-41f3-88a6-d5339c438fff","Type":"ContainerDied","Data":"b4e9652d680d0769885e340ea1c5e364bdd99c6ea6b532de8c239df55ba5c48d"} Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.200885 4931 scope.go:117] "RemoveContainer" containerID="85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88" Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.248386 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b8f5df775-m6dvd"] Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.255796 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b8f5df775-m6dvd"] Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.385788 4931 scope.go:117] "RemoveContainer" containerID="16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb" Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.412764 4931 scope.go:117] "RemoveContainer" containerID="85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88" Jan 30 06:47:00 crc kubenswrapper[4931]: E0130 06:47:00.413283 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88\": container with ID starting with 85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88 not found: ID does not exist" 
containerID="85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88" Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.413328 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88"} err="failed to get container status \"85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88\": rpc error: code = NotFound desc = could not find container \"85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88\": container with ID starting with 85548fef8024dea6f8f6115f0c41910b99449a559d12a7ba0b4376b470f15f88 not found: ID does not exist" Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.413352 4931 scope.go:117] "RemoveContainer" containerID="16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb" Jan 30 06:47:00 crc kubenswrapper[4931]: E0130 06:47:00.413824 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb\": container with ID starting with 16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb not found: ID does not exist" containerID="16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb" Jan 30 06:47:00 crc kubenswrapper[4931]: I0130 06:47:00.413849 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb"} err="failed to get container status \"16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb\": rpc error: code = NotFound desc = could not find container \"16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb\": container with ID starting with 16d9ec94d86ff7aef0c860f5097848e89654e8c653024be93461df34e92fa0cb not found: ID does not exist" Jan 30 06:47:01 crc kubenswrapper[4931]: I0130 06:47:01.305195 4931 scope.go:117] 
"RemoveContainer" containerID="6f79520471e9a429df8d71872cafba7c48f1385750af4061f0e5ea5c4355f53e" Jan 30 06:47:01 crc kubenswrapper[4931]: I0130 06:47:01.338335 4931 scope.go:117] "RemoveContainer" containerID="b12392121e0278ef6aaee0ef2cb91f20ce791df236403c3611d10649bcb909d3" Jan 30 06:47:01 crc kubenswrapper[4931]: I0130 06:47:01.373099 4931 scope.go:117] "RemoveContainer" containerID="4711e717af206225417ee23e6a5a6867fd0fca04b0c1bb798437c5d765e9f38b" Jan 30 06:47:01 crc kubenswrapper[4931]: I0130 06:47:01.431959 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" path="/var/lib/kubelet/pods/1d524b32-d060-41f3-88a6-d5339c438fff/volumes" Jan 30 06:47:05 crc kubenswrapper[4931]: I0130 06:47:05.432824 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:47:06 crc kubenswrapper[4931]: I0130 06:47:06.267048 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"fb726f99bee299533a0b10daf42c4d3c80f89e1b2459842bb36e1df7a3f9faa7"} Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.212041 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll"] Jan 30 06:47:08 crc kubenswrapper[4931]: E0130 06:47:08.212954 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon-log" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.212966 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon-log" Jan 30 06:47:08 crc kubenswrapper[4931]: E0130 06:47:08.212978 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" 
containerName="horizon" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.212986 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.213175 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.213200 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d524b32-d060-41f3-88a6-d5339c438fff" containerName="horizon-log" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.214654 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.216573 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.232876 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll"] Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.333455 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmcjq\" (UniqueName: \"kubernetes.io/projected/8db6c802-44ea-48b4-a63f-c6c43492e6bc-kube-api-access-lmcjq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.333725 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-util\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.334103 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.438492 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.438617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.438640 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmcjq\" (UniqueName: \"kubernetes.io/projected/8db6c802-44ea-48b4-a63f-c6c43492e6bc-kube-api-access-lmcjq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.439381 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.439672 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.490226 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmcjq\" (UniqueName: \"kubernetes.io/projected/8db6c802-44ea-48b4-a63f-c6c43492e6bc-kube-api-access-lmcjq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:08 crc kubenswrapper[4931]: I0130 06:47:08.532846 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:09 crc kubenswrapper[4931]: I0130 06:47:09.081503 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll"] Jan 30 06:47:09 crc kubenswrapper[4931]: I0130 06:47:09.294162 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" event={"ID":"8db6c802-44ea-48b4-a63f-c6c43492e6bc","Type":"ContainerStarted","Data":"a25c57f5fdcd778577fea8d3560ffb6508c8c6a43e5aa9b1c708756b6a5b4cda"} Jan 30 06:47:09 crc kubenswrapper[4931]: I0130 06:47:09.294541 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" event={"ID":"8db6c802-44ea-48b4-a63f-c6c43492e6bc","Type":"ContainerStarted","Data":"e83f3a2231d4e8b900f553f14b20706a468a8bbf08b33c6e42175c37e907d02b"} Jan 30 06:47:10 crc kubenswrapper[4931]: I0130 06:47:10.311274 4931 generic.go:334] "Generic (PLEG): container finished" podID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerID="a25c57f5fdcd778577fea8d3560ffb6508c8c6a43e5aa9b1c708756b6a5b4cda" exitCode=0 Jan 30 06:47:10 crc kubenswrapper[4931]: I0130 06:47:10.311325 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" event={"ID":"8db6c802-44ea-48b4-a63f-c6c43492e6bc","Type":"ContainerDied","Data":"a25c57f5fdcd778577fea8d3560ffb6508c8c6a43e5aa9b1c708756b6a5b4cda"} Jan 30 06:47:12 crc kubenswrapper[4931]: I0130 06:47:12.340052 4931 generic.go:334] "Generic (PLEG): container finished" podID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerID="3d9512a36b8f2d2bbb2d2a714b4fd87023b7b20bbdb765b5393aa21e003e11fe" exitCode=0 Jan 30 06:47:12 crc kubenswrapper[4931]: I0130 06:47:12.340166 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" event={"ID":"8db6c802-44ea-48b4-a63f-c6c43492e6bc","Type":"ContainerDied","Data":"3d9512a36b8f2d2bbb2d2a714b4fd87023b7b20bbdb765b5393aa21e003e11fe"} Jan 30 06:47:13 crc kubenswrapper[4931]: I0130 06:47:13.353189 4931 generic.go:334] "Generic (PLEG): container finished" podID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerID="768e5dca9c8f197b42c027297409e1b69f038de6d463f6b9a171e4d80977f474" exitCode=0 Jan 30 06:47:13 crc kubenswrapper[4931]: I0130 06:47:13.353298 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" event={"ID":"8db6c802-44ea-48b4-a63f-c6c43492e6bc","Type":"ContainerDied","Data":"768e5dca9c8f197b42c027297409e1b69f038de6d463f6b9a171e4d80977f474"} Jan 30 06:47:14 crc kubenswrapper[4931]: I0130 06:47:14.066530 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9f04-account-create-update-wgg9g"] Jan 30 06:47:14 crc kubenswrapper[4931]: I0130 06:47:14.097053 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9f04-account-create-update-wgg9g"] Jan 30 06:47:14 crc kubenswrapper[4931]: I0130 06:47:14.112916 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zkc49"] Jan 30 06:47:14 crc kubenswrapper[4931]: I0130 06:47:14.122904 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zkc49"] Jan 30 06:47:14 crc kubenswrapper[4931]: I0130 06:47:14.812185 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.000270 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmcjq\" (UniqueName: \"kubernetes.io/projected/8db6c802-44ea-48b4-a63f-c6c43492e6bc-kube-api-access-lmcjq\") pod \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.000465 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-bundle\") pod \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.000569 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-util\") pod \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\" (UID: \"8db6c802-44ea-48b4-a63f-c6c43492e6bc\") " Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.002542 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-bundle" (OuterVolumeSpecName: "bundle") pod "8db6c802-44ea-48b4-a63f-c6c43492e6bc" (UID: "8db6c802-44ea-48b4-a63f-c6c43492e6bc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.013586 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db6c802-44ea-48b4-a63f-c6c43492e6bc-kube-api-access-lmcjq" (OuterVolumeSpecName: "kube-api-access-lmcjq") pod "8db6c802-44ea-48b4-a63f-c6c43492e6bc" (UID: "8db6c802-44ea-48b4-a63f-c6c43492e6bc"). InnerVolumeSpecName "kube-api-access-lmcjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.016248 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-util" (OuterVolumeSpecName: "util") pod "8db6c802-44ea-48b4-a63f-c6c43492e6bc" (UID: "8db6c802-44ea-48b4-a63f-c6c43492e6bc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.102944 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmcjq\" (UniqueName: \"kubernetes.io/projected/8db6c802-44ea-48b4-a63f-c6c43492e6bc-kube-api-access-lmcjq\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.102992 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.103011 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db6c802-44ea-48b4-a63f-c6c43492e6bc-util\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.378449 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" event={"ID":"8db6c802-44ea-48b4-a63f-c6c43492e6bc","Type":"ContainerDied","Data":"e83f3a2231d4e8b900f553f14b20706a468a8bbf08b33c6e42175c37e907d02b"} Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.378805 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e83f3a2231d4e8b900f553f14b20706a468a8bbf08b33c6e42175c37e907d02b" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.378529 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.442209 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d676c50-5909-4eeb-a22b-63823761ab17" path="/var/lib/kubelet/pods/7d676c50-5909-4eeb-a22b-63823761ab17/volumes" Jan 30 06:47:15 crc kubenswrapper[4931]: I0130 06:47:15.443220 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80acfb99-2d96-453a-b29a-62f23608dd5f" path="/var/lib/kubelet/pods/80acfb99-2d96-453a-b29a-62f23608dd5f/volumes" Jan 30 06:47:20 crc kubenswrapper[4931]: I0130 06:47:20.044769 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-x9ngk"] Jan 30 06:47:20 crc kubenswrapper[4931]: I0130 06:47:20.055697 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-x9ngk"] Jan 30 06:47:21 crc kubenswrapper[4931]: I0130 06:47:21.432735 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e7d2a9-093c-4495-81ab-99972c72b179" path="/var/lib/kubelet/pods/08e7d2a9-093c-4495-81ab-99972c72b179/volumes" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.073392 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m"] Jan 30 06:47:26 crc kubenswrapper[4931]: E0130 06:47:26.074431 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerName="extract" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.074446 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerName="extract" Jan 30 06:47:26 crc kubenswrapper[4931]: E0130 06:47:26.074471 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerName="pull" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 
06:47:26.074478 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerName="pull" Jan 30 06:47:26 crc kubenswrapper[4931]: E0130 06:47:26.074491 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerName="util" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.074498 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerName="util" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.074733 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db6c802-44ea-48b4-a63f-c6c43492e6bc" containerName="extract" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.075473 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.083409 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.083822 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.083962 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-vkqdq" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.111248 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.145061 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.146848 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.152251 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-c5zbp" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.154904 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.165304 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.170673 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.187479 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.200552 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.234502 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/119a1b91-5877-408e-8721-dccac5a05367-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z\" (UID: \"119a1b91-5877-408e-8721-dccac5a05367\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.234588 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/119a1b91-5877-408e-8721-dccac5a05367-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z\" (UID: \"119a1b91-5877-408e-8721-dccac5a05367\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.234749 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8xcf\" (UniqueName: \"kubernetes.io/projected/2668098b-064f-4807-b2ee-7efb5dc89fb8-kube-api-access-m8xcf\") pod \"obo-prometheus-operator-68bc856cb9-lx27m\" (UID: \"2668098b-064f-4807-b2ee-7efb5dc89fb8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.271795 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qm276"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.273039 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.275678 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-pn6vp" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.275796 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.293880 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qm276"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.337052 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8xcf\" (UniqueName: \"kubernetes.io/projected/2668098b-064f-4807-b2ee-7efb5dc89fb8-kube-api-access-m8xcf\") pod \"obo-prometheus-operator-68bc856cb9-lx27m\" (UID: \"2668098b-064f-4807-b2ee-7efb5dc89fb8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.337161 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38907dab-62b6-4364-b48c-8300b1fa2ad2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr\" (UID: \"38907dab-62b6-4364-b48c-8300b1fa2ad2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.337220 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/119a1b91-5877-408e-8721-dccac5a05367-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z\" (UID: \"119a1b91-5877-408e-8721-dccac5a05367\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.337285 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/119a1b91-5877-408e-8721-dccac5a05367-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z\" (UID: \"119a1b91-5877-408e-8721-dccac5a05367\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.337328 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38907dab-62b6-4364-b48c-8300b1fa2ad2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr\" (UID: \"38907dab-62b6-4364-b48c-8300b1fa2ad2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.343732 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/119a1b91-5877-408e-8721-dccac5a05367-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z\" (UID: \"119a1b91-5877-408e-8721-dccac5a05367\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.359133 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/119a1b91-5877-408e-8721-dccac5a05367-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z\" (UID: \"119a1b91-5877-408e-8721-dccac5a05367\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.360009 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8xcf\" (UniqueName: \"kubernetes.io/projected/2668098b-064f-4807-b2ee-7efb5dc89fb8-kube-api-access-m8xcf\") pod \"obo-prometheus-operator-68bc856cb9-lx27m\" (UID: \"2668098b-064f-4807-b2ee-7efb5dc89fb8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.388394 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gw297"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.389817 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.403841 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vlfpb" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.407405 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.417153 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gw297"] Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.439834 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38907dab-62b6-4364-b48c-8300b1fa2ad2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr\" (UID: \"38907dab-62b6-4364-b48c-8300b1fa2ad2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.439980 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5zv\" (UniqueName: \"kubernetes.io/projected/c63c0b3f-7290-4318-8db6-a1ae150b22e0-kube-api-access-4d5zv\") pod \"observability-operator-59bdc8b94-qm276\" (UID: \"c63c0b3f-7290-4318-8db6-a1ae150b22e0\") " pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.440031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c63c0b3f-7290-4318-8db6-a1ae150b22e0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qm276\" (UID: \"c63c0b3f-7290-4318-8db6-a1ae150b22e0\") " pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.440160 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38907dab-62b6-4364-b48c-8300b1fa2ad2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr\" (UID: 
\"38907dab-62b6-4364-b48c-8300b1fa2ad2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.443032 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38907dab-62b6-4364-b48c-8300b1fa2ad2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr\" (UID: \"38907dab-62b6-4364-b48c-8300b1fa2ad2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.446524 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38907dab-62b6-4364-b48c-8300b1fa2ad2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr\" (UID: \"38907dab-62b6-4364-b48c-8300b1fa2ad2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.467152 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.489095 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.543163 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55072f8e-c1ef-45fd-9ec3-43e74afed3a7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gw297\" (UID: \"55072f8e-c1ef-45fd-9ec3-43e74afed3a7\") " pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.543359 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d5zv\" (UniqueName: \"kubernetes.io/projected/c63c0b3f-7290-4318-8db6-a1ae150b22e0-kube-api-access-4d5zv\") pod \"observability-operator-59bdc8b94-qm276\" (UID: \"c63c0b3f-7290-4318-8db6-a1ae150b22e0\") " pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.543473 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c63c0b3f-7290-4318-8db6-a1ae150b22e0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qm276\" (UID: \"c63c0b3f-7290-4318-8db6-a1ae150b22e0\") " pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.543727 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck4jr\" (UniqueName: \"kubernetes.io/projected/55072f8e-c1ef-45fd-9ec3-43e74afed3a7-kube-api-access-ck4jr\") pod \"perses-operator-5bf474d74f-gw297\" (UID: \"55072f8e-c1ef-45fd-9ec3-43e74afed3a7\") " pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.572890 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c63c0b3f-7290-4318-8db6-a1ae150b22e0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qm276\" (UID: \"c63c0b3f-7290-4318-8db6-a1ae150b22e0\") " pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.584048 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d5zv\" (UniqueName: \"kubernetes.io/projected/c63c0b3f-7290-4318-8db6-a1ae150b22e0-kube-api-access-4d5zv\") pod \"observability-operator-59bdc8b94-qm276\" (UID: \"c63c0b3f-7290-4318-8db6-a1ae150b22e0\") " pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.595444 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.654921 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck4jr\" (UniqueName: \"kubernetes.io/projected/55072f8e-c1ef-45fd-9ec3-43e74afed3a7-kube-api-access-ck4jr\") pod \"perses-operator-5bf474d74f-gw297\" (UID: \"55072f8e-c1ef-45fd-9ec3-43e74afed3a7\") " pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.655082 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55072f8e-c1ef-45fd-9ec3-43e74afed3a7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gw297\" (UID: \"55072f8e-c1ef-45fd-9ec3-43e74afed3a7\") " pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.656006 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/55072f8e-c1ef-45fd-9ec3-43e74afed3a7-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gw297\" (UID: \"55072f8e-c1ef-45fd-9ec3-43e74afed3a7\") " pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.719473 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck4jr\" (UniqueName: \"kubernetes.io/projected/55072f8e-c1ef-45fd-9ec3-43e74afed3a7-kube-api-access-ck4jr\") pod \"perses-operator-5bf474d74f-gw297\" (UID: \"55072f8e-c1ef-45fd-9ec3-43e74afed3a7\") " pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:26 crc kubenswrapper[4931]: I0130 06:47:26.874557 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.084756 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m"] Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.097219 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z"] Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.217320 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr"] Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.341764 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qm276"] Jan 30 06:47:27 crc kubenswrapper[4931]: W0130 06:47:27.346854 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63c0b3f_7290_4318_8db6_a1ae150b22e0.slice/crio-1e1bd0d954b66af59ae51108612c7e3e66ee714efeac262528918ec8e68c0294 WatchSource:0}: Error finding container 
1e1bd0d954b66af59ae51108612c7e3e66ee714efeac262528918ec8e68c0294: Status 404 returned error can't find the container with id 1e1bd0d954b66af59ae51108612c7e3e66ee714efeac262528918ec8e68c0294 Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.455079 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gw297"] Jan 30 06:47:27 crc kubenswrapper[4931]: W0130 06:47:27.464354 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55072f8e_c1ef_45fd_9ec3_43e74afed3a7.slice/crio-cdc09e6144bfe13c994836816be674a2482b6fba1840c0c8d82684afbe83855f WatchSource:0}: Error finding container cdc09e6144bfe13c994836816be674a2482b6fba1840c0c8d82684afbe83855f: Status 404 returned error can't find the container with id cdc09e6144bfe13c994836816be674a2482b6fba1840c0c8d82684afbe83855f Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.537008 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-qm276" event={"ID":"c63c0b3f-7290-4318-8db6-a1ae150b22e0","Type":"ContainerStarted","Data":"1e1bd0d954b66af59ae51108612c7e3e66ee714efeac262528918ec8e68c0294"} Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.539208 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" event={"ID":"2668098b-064f-4807-b2ee-7efb5dc89fb8","Type":"ContainerStarted","Data":"337b1d501ed16bb3dddf280a3f7608ee524e7d7ad390612a1f4b5a8428bd92ad"} Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.541171 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" event={"ID":"119a1b91-5877-408e-8721-dccac5a05367","Type":"ContainerStarted","Data":"49697c2c21142e44412f16e27dc59cefe601284a387b0d8f77575a53056342a1"} Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.543756 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gw297" event={"ID":"55072f8e-c1ef-45fd-9ec3-43e74afed3a7","Type":"ContainerStarted","Data":"cdc09e6144bfe13c994836816be674a2482b6fba1840c0c8d82684afbe83855f"} Jan 30 06:47:27 crc kubenswrapper[4931]: I0130 06:47:27.547800 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" event={"ID":"38907dab-62b6-4364-b48c-8300b1fa2ad2","Type":"ContainerStarted","Data":"100d94ead8b427afcfd12524ab4a6e5628fc75d3aeb2e3152310812206695d05"} Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.755196 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" event={"ID":"2668098b-064f-4807-b2ee-7efb5dc89fb8","Type":"ContainerStarted","Data":"b1ae7c48818c75c8f0f058c5a8e32f448f0319b09927c3cbfba7d5a6ec89e174"} Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.762356 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" event={"ID":"119a1b91-5877-408e-8721-dccac5a05367","Type":"ContainerStarted","Data":"f757d946a3ad0e0d7f9bc785018faa8d2249f50a81150e7d76525a29da1c33f5"} Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.764255 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gw297" event={"ID":"55072f8e-c1ef-45fd-9ec3-43e74afed3a7","Type":"ContainerStarted","Data":"bae6c02ffb2792738f9ce9a06213783c8b1959ada4ab57320eafed7bb96a04bb"} Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.764353 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.766104 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" event={"ID":"38907dab-62b6-4364-b48c-8300b1fa2ad2","Type":"ContainerStarted","Data":"92b828684851fdf725b5b5116d1adf9cac3d5fddcbc4aca79eaeb832d1c93b8f"} Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.767779 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-qm276" event={"ID":"c63c0b3f-7290-4318-8db6-a1ae150b22e0","Type":"ContainerStarted","Data":"7e8a19a93897c4d07e99f2031c8047a8307c95f13efc0ccfc7740dc247da194a"} Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.768264 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.769978 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-qm276" Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.782111 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-lx27m" podStartSLOduration=2.527822385 podStartE2EDuration="14.782094323s" podCreationTimestamp="2026-01-30 06:47:26 +0000 UTC" firstStartedPulling="2026-01-30 06:47:27.087678535 +0000 UTC m=+5982.457588792" lastFinishedPulling="2026-01-30 06:47:39.341950473 +0000 UTC m=+5994.711860730" observedRunningTime="2026-01-30 06:47:40.778405479 +0000 UTC m=+5996.148315736" watchObservedRunningTime="2026-01-30 06:47:40.782094323 +0000 UTC m=+5996.152004580" Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.804712 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr" podStartSLOduration=2.702875903 podStartE2EDuration="14.804694719s" podCreationTimestamp="2026-01-30 06:47:26 +0000 UTC" firstStartedPulling="2026-01-30 
06:47:27.224633021 +0000 UTC m=+5982.594543268" lastFinishedPulling="2026-01-30 06:47:39.326451817 +0000 UTC m=+5994.696362084" observedRunningTime="2026-01-30 06:47:40.795932312 +0000 UTC m=+5996.165842569" watchObservedRunningTime="2026-01-30 06:47:40.804694719 +0000 UTC m=+5996.174604976" Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.856155 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-gw297" podStartSLOduration=2.979634913 podStartE2EDuration="14.856139677s" podCreationTimestamp="2026-01-30 06:47:26 +0000 UTC" firstStartedPulling="2026-01-30 06:47:27.467859097 +0000 UTC m=+5982.837769364" lastFinishedPulling="2026-01-30 06:47:39.344363861 +0000 UTC m=+5994.714274128" observedRunningTime="2026-01-30 06:47:40.81930385 +0000 UTC m=+5996.189214107" watchObservedRunningTime="2026-01-30 06:47:40.856139677 +0000 UTC m=+5996.226049934" Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.864036 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-qm276" podStartSLOduration=2.807341913 podStartE2EDuration="14.864020609s" podCreationTimestamp="2026-01-30 06:47:26 +0000 UTC" firstStartedPulling="2026-01-30 06:47:27.348885358 +0000 UTC m=+5982.718795615" lastFinishedPulling="2026-01-30 06:47:39.405564054 +0000 UTC m=+5994.775474311" observedRunningTime="2026-01-30 06:47:40.859220384 +0000 UTC m=+5996.229130641" watchObservedRunningTime="2026-01-30 06:47:40.864020609 +0000 UTC m=+5996.233930866" Jan 30 06:47:40 crc kubenswrapper[4931]: I0130 06:47:40.884619 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z" podStartSLOduration=2.648008158 podStartE2EDuration="14.884605298s" podCreationTimestamp="2026-01-30 06:47:26 +0000 UTC" firstStartedPulling="2026-01-30 06:47:27.102966216 +0000 UTC 
m=+5982.472876473" lastFinishedPulling="2026-01-30 06:47:39.339563306 +0000 UTC m=+5994.709473613" observedRunningTime="2026-01-30 06:47:40.880720809 +0000 UTC m=+5996.250631066" watchObservedRunningTime="2026-01-30 06:47:40.884605298 +0000 UTC m=+5996.254515555" Jan 30 06:47:46 crc kubenswrapper[4931]: I0130 06:47:46.878957 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-gw297" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.403540 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.404104 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="459f1ff6-e3cb-45a8-9a4a-0e24e7881407" containerName="openstackclient" containerID="cri-o://73102981ee7cbbfca21cfe57a2e7dc11e7bcd50d611864190f31f741ca04bd1e" gracePeriod=2 Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.416447 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.463294 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: E0130 06:47:49.463773 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459f1ff6-e3cb-45a8-9a4a-0e24e7881407" containerName="openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.463791 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="459f1ff6-e3cb-45a8-9a4a-0e24e7881407" containerName="openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.464033 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="459f1ff6-e3cb-45a8-9a4a-0e24e7881407" containerName="openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.464726 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.486137 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.548509 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.548667 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config-secret\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.548715 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75swp\" (UniqueName: \"kubernetes.io/projected/c06bf028-7b95-418f-9285-80094ac02aa7-kube-api-access-75swp\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.575052 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: E0130 06:47:49.576721 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-75swp openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="c06bf028-7b95-418f-9285-80094ac02aa7" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.635660 4931 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.664034 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config-secret\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.664124 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75swp\" (UniqueName: \"kubernetes.io/projected/c06bf028-7b95-418f-9285-80094ac02aa7-kube-api-access-75swp\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.664336 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.669771 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: E0130 06:47:49.670241 4931 projected.go:194] Error preparing data for projected volume kube-api-access-75swp for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c06bf028-7b95-418f-9285-80094ac02aa7) does not match the UID in record. 
The object might have been deleted and then recreated Jan 30 06:47:49 crc kubenswrapper[4931]: E0130 06:47:49.670378 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c06bf028-7b95-418f-9285-80094ac02aa7-kube-api-access-75swp podName:c06bf028-7b95-418f-9285-80094ac02aa7 nodeName:}" failed. No retries permitted until 2026-01-30 06:47:50.17035598 +0000 UTC m=+6005.540266237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-75swp" (UniqueName: "kubernetes.io/projected/c06bf028-7b95-418f-9285-80094ac02aa7-kube-api-access-75swp") pod "openstackclient" (UID: "c06bf028-7b95-418f-9285-80094ac02aa7") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c06bf028-7b95-418f-9285-80094ac02aa7) does not match the UID in record. The object might have been deleted and then recreated Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.680208 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.681882 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.716549 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.730066 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config-secret\") pod \"openstackclient\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.769259 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9261635-8331-44ae-88d1-df73db930d2d-openstack-config\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.769318 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59l2t\" (UniqueName: \"kubernetes.io/projected/a9261635-8331-44ae-88d1-df73db930d2d-kube-api-access-59l2t\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.769367 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9261635-8331-44ae-88d1-df73db930d2d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.813176 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.814882 4931 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.819798 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lhdr6" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.847254 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.870579 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59l2t\" (UniqueName: \"kubernetes.io/projected/a9261635-8331-44ae-88d1-df73db930d2d-kube-api-access-59l2t\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.870665 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9261635-8331-44ae-88d1-df73db930d2d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.870804 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fx5b\" (UniqueName: \"kubernetes.io/projected/eb1cdd0a-4520-49ce-8bc6-686dba45e7e8-kube-api-access-8fx5b\") pod \"kube-state-metrics-0\" (UID: \"eb1cdd0a-4520-49ce-8bc6-686dba45e7e8\") " pod="openstack/kube-state-metrics-0" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.870832 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9261635-8331-44ae-88d1-df73db930d2d-openstack-config\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: 
I0130 06:47:49.871607 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9261635-8331-44ae-88d1-df73db930d2d-openstack-config\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.878010 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9261635-8331-44ae-88d1-df73db930d2d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.901282 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59l2t\" (UniqueName: \"kubernetes.io/projected/a9261635-8331-44ae-88d1-df73db930d2d-kube-api-access-59l2t\") pod \"openstackclient\" (UID: \"a9261635-8331-44ae-88d1-df73db930d2d\") " pod="openstack/openstackclient" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.972547 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fx5b\" (UniqueName: \"kubernetes.io/projected/eb1cdd0a-4520-49ce-8bc6-686dba45e7e8-kube-api-access-8fx5b\") pod \"kube-state-metrics-0\" (UID: \"eb1cdd0a-4520-49ce-8bc6-686dba45e7e8\") " pod="openstack/kube-state-metrics-0" Jan 30 06:47:49 crc kubenswrapper[4931]: I0130 06:47:49.992331 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fx5b\" (UniqueName: \"kubernetes.io/projected/eb1cdd0a-4520-49ce-8bc6-686dba45e7e8-kube-api-access-8fx5b\") pod \"kube-state-metrics-0\" (UID: \"eb1cdd0a-4520-49ce-8bc6-686dba45e7e8\") " pod="openstack/kube-state-metrics-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.094692 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.106614 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.107128 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c06bf028-7b95-418f-9285-80094ac02aa7" podUID="a9261635-8331-44ae-88d1-df73db930d2d" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.109336 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.147322 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.179077 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config-secret\") pod \"c06bf028-7b95-418f-9285-80094ac02aa7\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.179299 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config\") pod \"c06bf028-7b95-418f-9285-80094ac02aa7\" (UID: \"c06bf028-7b95-418f-9285-80094ac02aa7\") " Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.179796 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75swp\" (UniqueName: \"kubernetes.io/projected/c06bf028-7b95-418f-9285-80094ac02aa7-kube-api-access-75swp\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.180321 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c06bf028-7b95-418f-9285-80094ac02aa7" (UID: "c06bf028-7b95-418f-9285-80094ac02aa7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.209195 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c06bf028-7b95-418f-9285-80094ac02aa7" (UID: "c06bf028-7b95-418f-9285-80094ac02aa7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.281115 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.281149 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c06bf028-7b95-418f-9285-80094ac02aa7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.547907 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.551168 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.555033 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.555301 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.563035 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.563235 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.563526 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-w2s75" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.580363 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.601953 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfc9f\" (UniqueName: \"kubernetes.io/projected/faef005b-c58c-4b22-944c-defd3471fa32-kube-api-access-hfc9f\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.601985 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 
06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.602048 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/faef005b-c58c-4b22-944c-defd3471fa32-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.602069 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/faef005b-c58c-4b22-944c-defd3471fa32-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.602101 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/faef005b-c58c-4b22-944c-defd3471fa32-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.602122 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.602216 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.703496 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/faef005b-c58c-4b22-944c-defd3471fa32-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.703817 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.703985 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.704070 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfc9f\" (UniqueName: \"kubernetes.io/projected/faef005b-c58c-4b22-944c-defd3471fa32-kube-api-access-hfc9f\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.704146 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " 
pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.704244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/faef005b-c58c-4b22-944c-defd3471fa32-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.704322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/faef005b-c58c-4b22-944c-defd3471fa32-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.704847 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/faef005b-c58c-4b22-944c-defd3471fa32-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.717021 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.717707 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " 
pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.718412 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/faef005b-c58c-4b22-944c-defd3471fa32-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.728000 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/faef005b-c58c-4b22-944c-defd3471fa32-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.740051 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/faef005b-c58c-4b22-944c-defd3471fa32-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.748795 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfc9f\" (UniqueName: \"kubernetes.io/projected/faef005b-c58c-4b22-944c-defd3471fa32-kube-api-access-hfc9f\") pod \"alertmanager-metric-storage-0\" (UID: \"faef005b-c58c-4b22-944c-defd3471fa32\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.919398 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 30 06:47:50 crc kubenswrapper[4931]: I0130 06:47:50.920542 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.028920 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.031629 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.040794 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.041060 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-bx8cf" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.041206 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.041842 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.041875 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.042063 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.042089 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.042239 4931 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"prometheus-metric-storage" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.072488 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.129467 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.139557 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eb1cdd0a-4520-49ce-8bc6-686dba45e7e8","Type":"ContainerStarted","Data":"2d8f55e0331f11036c5786ac23d7de2c089dbeb8f93b1579ad7aaa1b700aa4b3"} Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.176348 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.179606 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c06bf028-7b95-418f-9285-80094ac02aa7" podUID="a9261635-8331-44ae-88d1-df73db930d2d" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228569 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228623 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-config\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: 
I0130 06:47:51.228648 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228669 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228698 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228751 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/201626a3-bc04-48ab-859c-5a7ffe97670e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228816 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/201626a3-bc04-48ab-859c-5a7ffe97670e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") 
" pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228865 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwx6\" (UniqueName: \"kubernetes.io/projected/201626a3-bc04-48ab-859c-5a7ffe97670e-kube-api-access-npwx6\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228886 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.228909 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.332692 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/201626a3-bc04-48ab-859c-5a7ffe97670e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.332986 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwx6\" (UniqueName: 
\"kubernetes.io/projected/201626a3-bc04-48ab-859c-5a7ffe97670e-kube-api-access-npwx6\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333032 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333090 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333113 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-config\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333132 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333166 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333189 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.333222 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/201626a3-bc04-48ab-859c-5a7ffe97670e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.339236 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 
06:47:51.339722 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.340382 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/201626a3-bc04-48ab-859c-5a7ffe97670e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.344788 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/201626a3-bc04-48ab-859c-5a7ffe97670e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.346038 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/201626a3-bc04-48ab-859c-5a7ffe97670e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.348905 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-config\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.358151 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.363401 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwx6\" (UniqueName: \"kubernetes.io/projected/201626a3-bc04-48ab-859c-5a7ffe97670e-kube-api-access-npwx6\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.364217 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.364271 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ea2045aa96daddbc52ed0a3ac5da2f96eb3f4f7be6270f0e4a84add5a5ee748a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.386996 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/201626a3-bc04-48ab-859c-5a7ffe97670e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.451810 4931 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="c06bf028-7b95-418f-9285-80094ac02aa7" path="/var/lib/kubelet/pods/c06bf028-7b95-418f-9285-80094ac02aa7/volumes" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.668871 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.776902 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d50d636-9fa9-4d96-8a8b-86354c2ddcd2\") pod \"prometheus-metric-storage-0\" (UID: \"201626a3-bc04-48ab-859c-5a7ffe97670e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:51 crc kubenswrapper[4931]: I0130 06:47:51.992086 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 06:47:52 crc kubenswrapper[4931]: I0130 06:47:52.138968 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"faef005b-c58c-4b22-944c-defd3471fa32","Type":"ContainerStarted","Data":"09c97a7f80d27798d89cf46d3d17d28ad1db0bdb1357a4f8c7152d9252762f4c"} Jan 30 06:47:52 crc kubenswrapper[4931]: I0130 06:47:52.140331 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a9261635-8331-44ae-88d1-df73db930d2d","Type":"ContainerStarted","Data":"60cf8cbef84de7d548f52576f9f6189fcb562cc1bc3caf2f4e4227b7b4a94ca0"} Jan 30 06:47:52 crc kubenswrapper[4931]: I0130 06:47:52.143190 4931 generic.go:334] "Generic (PLEG): container finished" podID="459f1ff6-e3cb-45a8-9a4a-0e24e7881407" containerID="73102981ee7cbbfca21cfe57a2e7dc11e7bcd50d611864190f31f741ca04bd1e" exitCode=137 Jan 30 06:47:52 crc kubenswrapper[4931]: I0130 06:47:52.534695 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:47:53 crc 
kubenswrapper[4931]: I0130 06:47:53.155391 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a9261635-8331-44ae-88d1-df73db930d2d","Type":"ContainerStarted","Data":"7ef25917a5433af6332663c4a2a47d9d78ee326692e15534fb17e746d3f90e43"}
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.158120 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"201626a3-bc04-48ab-859c-5a7ffe97670e","Type":"ContainerStarted","Data":"ad1131374e01305446fc3c82a63690da5edabbd594006bde2373428cfa38f313"}
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.171534 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.171514745 podStartE2EDuration="4.171514745s" podCreationTimestamp="2026-01-30 06:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:47:53.171199386 +0000 UTC m=+6008.541109643" watchObservedRunningTime="2026-01-30 06:47:53.171514745 +0000 UTC m=+6008.541425052"
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.267967 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.379465 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config\") pod \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") "
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.379572 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config-secret\") pod \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") "
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.379706 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c59zl\" (UniqueName: \"kubernetes.io/projected/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-kube-api-access-c59zl\") pod \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\" (UID: \"459f1ff6-e3cb-45a8-9a4a-0e24e7881407\") "
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.389887 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-kube-api-access-c59zl" (OuterVolumeSpecName: "kube-api-access-c59zl") pod "459f1ff6-e3cb-45a8-9a4a-0e24e7881407" (UID: "459f1ff6-e3cb-45a8-9a4a-0e24e7881407"). InnerVolumeSpecName "kube-api-access-c59zl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.436194 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "459f1ff6-e3cb-45a8-9a4a-0e24e7881407" (UID: "459f1ff6-e3cb-45a8-9a4a-0e24e7881407"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.461224 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "459f1ff6-e3cb-45a8-9a4a-0e24e7881407" (UID: "459f1ff6-e3cb-45a8-9a4a-0e24e7881407"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.482661 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.482703 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:53 crc kubenswrapper[4931]: I0130 06:47:53.482713 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c59zl\" (UniqueName: \"kubernetes.io/projected/459f1ff6-e3cb-45a8-9a4a-0e24e7881407-kube-api-access-c59zl\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:54 crc kubenswrapper[4931]: I0130 06:47:54.170477 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eb1cdd0a-4520-49ce-8bc6-686dba45e7e8","Type":"ContainerStarted","Data":"44a177c6e04af3412b97c40f5dabd9fcb08bf20bae8016685d5f9079a80c66ad"}
Jan 30 06:47:54 crc kubenswrapper[4931]: I0130 06:47:54.170884 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 30 06:47:54 crc kubenswrapper[4931]: I0130 06:47:54.172262 4931 scope.go:117] "RemoveContainer" containerID="73102981ee7cbbfca21cfe57a2e7dc11e7bcd50d611864190f31f741ca04bd1e"
Jan 30 06:47:54 crc kubenswrapper[4931]: I0130 06:47:54.172468 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:47:54 crc kubenswrapper[4931]: I0130 06:47:54.197543 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.801984573 podStartE2EDuration="5.197523786s" podCreationTimestamp="2026-01-30 06:47:49 +0000 UTC" firstStartedPulling="2026-01-30 06:47:50.965535058 +0000 UTC m=+6006.335445315" lastFinishedPulling="2026-01-30 06:47:53.361074251 +0000 UTC m=+6008.730984528" observedRunningTime="2026-01-30 06:47:54.187739671 +0000 UTC m=+6009.557649918" watchObservedRunningTime="2026-01-30 06:47:54.197523786 +0000 UTC m=+6009.567434043"
Jan 30 06:47:55 crc kubenswrapper[4931]: I0130 06:47:55.438065 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459f1ff6-e3cb-45a8-9a4a-0e24e7881407" path="/var/lib/kubelet/pods/459f1ff6-e3cb-45a8-9a4a-0e24e7881407/volumes"
Jan 30 06:48:00 crc kubenswrapper[4931]: I0130 06:48:00.152204 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 30 06:48:01 crc kubenswrapper[4931]: I0130 06:48:01.506901 4931 scope.go:117] "RemoveContainer" containerID="d17e4d5da3cbd98de6ce9452e4b21d30ccf3ad5f026d8b30b101024dc6fd4576"
Jan 30 06:48:01 crc kubenswrapper[4931]: I0130 06:48:01.596304 4931 scope.go:117] "RemoveContainer" containerID="312f2f7be76e7df2d1974a8fc3d9bcd846d76d9bf2e6bb52018a7e69743078de"
Jan 30 06:48:01 crc kubenswrapper[4931]: I0130 06:48:01.636732 4931 scope.go:117] "RemoveContainer" containerID="a9a1121e8223cb02db35e17fa42991de13dcf37e427dbdd0bf16fa8038651093"
Jan 30 06:48:06 crc kubenswrapper[4931]: I0130 06:48:06.353751 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"201626a3-bc04-48ab-859c-5a7ffe97670e","Type":"ContainerStarted","Data":"c2eba11cac41762b5cd6c8068a20314519ca7b74bad5ea62a0adbde79dc90df4"}
Jan 30 06:48:06 crc kubenswrapper[4931]: I0130 06:48:06.358513 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"faef005b-c58c-4b22-944c-defd3471fa32","Type":"ContainerStarted","Data":"376225131c6a847f9df2d3be6e0526d8a705d0a9a91ec6c6c672058ed88bcafb"}
Jan 30 06:48:15 crc kubenswrapper[4931]: I0130 06:48:15.482263 4931 generic.go:334] "Generic (PLEG): container finished" podID="faef005b-c58c-4b22-944c-defd3471fa32" containerID="376225131c6a847f9df2d3be6e0526d8a705d0a9a91ec6c6c672058ed88bcafb" exitCode=0
Jan 30 06:48:15 crc kubenswrapper[4931]: I0130 06:48:15.482385 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"faef005b-c58c-4b22-944c-defd3471fa32","Type":"ContainerDied","Data":"376225131c6a847f9df2d3be6e0526d8a705d0a9a91ec6c6c672058ed88bcafb"}
Jan 30 06:48:15 crc kubenswrapper[4931]: I0130 06:48:15.485145 4931 generic.go:334] "Generic (PLEG): container finished" podID="201626a3-bc04-48ab-859c-5a7ffe97670e" containerID="c2eba11cac41762b5cd6c8068a20314519ca7b74bad5ea62a0adbde79dc90df4" exitCode=0
Jan 30 06:48:15 crc kubenswrapper[4931]: I0130 06:48:15.485188 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"201626a3-bc04-48ab-859c-5a7ffe97670e","Type":"ContainerDied","Data":"c2eba11cac41762b5cd6c8068a20314519ca7b74bad5ea62a0adbde79dc90df4"}
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.037775 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vm2gb"]
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.054040 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7dkqh"]
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.066158 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5xpsl"]
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.077655 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vm2gb"]
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.090007 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7dkqh"]
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.100054 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5xpsl"]
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.450594 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2ac10a-2179-4d51-b7e8-31ac3621d798" path="/var/lib/kubelet/pods/0d2ac10a-2179-4d51-b7e8-31ac3621d798/volumes"
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.451414 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d595bdd-ffa6-4292-b4c2-1eba0736a6a4" path="/var/lib/kubelet/pods/5d595bdd-ffa6-4292-b4c2-1eba0736a6a4/volumes"
Jan 30 06:48:17 crc kubenswrapper[4931]: I0130 06:48:17.464086 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ede2117e-e3d5-46f6-8a54-1cd987370470" path="/var/lib/kubelet/pods/ede2117e-e3d5-46f6-8a54-1cd987370470/volumes"
Jan 30 06:48:18 crc kubenswrapper[4931]: I0130 06:48:18.044202 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c3ab-account-create-update-6wqgk"]
Jan 30 06:48:18 crc kubenswrapper[4931]: I0130 06:48:18.059790 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9932-account-create-update-6qlx2"]
Jan 30 06:48:18 crc kubenswrapper[4931]: I0130 06:48:18.071562 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-184a-account-create-update-b6t5s"]
Jan 30 06:48:18 crc kubenswrapper[4931]: I0130 06:48:18.081574 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c3ab-account-create-update-6wqgk"]
Jan 30 06:48:18 crc kubenswrapper[4931]: I0130 06:48:18.090838 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9932-account-create-update-6qlx2"]
Jan 30 06:48:18 crc kubenswrapper[4931]: I0130 06:48:18.100171 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-184a-account-create-update-b6t5s"]
Jan 30 06:48:19 crc kubenswrapper[4931]: I0130 06:48:19.528523 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2a0233-04c5-4382-948d-809c1216b075" path="/var/lib/kubelet/pods/0c2a0233-04c5-4382-948d-809c1216b075/volumes"
Jan 30 06:48:19 crc kubenswrapper[4931]: I0130 06:48:19.594632 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="730d8243-e8f1-4b7a-b012-d65ff132d427" path="/var/lib/kubelet/pods/730d8243-e8f1-4b7a-b012-d65ff132d427/volumes"
Jan 30 06:48:19 crc kubenswrapper[4931]: I0130 06:48:19.595208 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a06db3-c381-45ef-883d-ee7393822e5a" path="/var/lib/kubelet/pods/c7a06db3-c381-45ef-883d-ee7393822e5a/volumes"
Jan 30 06:48:27 crc kubenswrapper[4931]: I0130 06:48:27.039765 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-thknd"]
Jan 30 06:48:27 crc kubenswrapper[4931]: I0130 06:48:27.052269 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-thknd"]
Jan 30 06:48:27 crc kubenswrapper[4931]: I0130 06:48:27.434097 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693a2e91-1503-4caa-a71d-4f65d99a913c" path="/var/lib/kubelet/pods/693a2e91-1503-4caa-a71d-4f65d99a913c/volumes"
Jan 30 06:48:29 crc kubenswrapper[4931]: E0130 06:48:29.970168 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741"
Jan 30 06:48:29 crc kubenswrapper[4931]: E0130 06:48:29.970858 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-npwx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(201626a3-bc04-48ab-859c-5a7ffe97670e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 30 06:48:30 crc kubenswrapper[4931]: I0130 06:48:30.670467 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"faef005b-c58c-4b22-944c-defd3471fa32","Type":"ContainerStarted","Data":"f1be7091ee17b166609f29d4cc5a133c686f77e6b89bdd157872b6f5dbddc0d1"}
Jan 30 06:48:34 crc kubenswrapper[4931]: I0130 06:48:34.733653 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"201626a3-bc04-48ab-859c-5a7ffe97670e","Type":"ContainerStarted","Data":"0a255d21e6258b4c05baeadd78ecbe9e74a01311366f48578347d846fe068bb1"}
Jan 30 06:48:35 crc kubenswrapper[4931]: I0130 06:48:35.750391 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"faef005b-c58c-4b22-944c-defd3471fa32","Type":"ContainerStarted","Data":"45728b94f180ab44ef641bed1d4b55e655ae789f5c8e494f0c900fdbd87751b5"}
Jan 30 06:48:35 crc kubenswrapper[4931]: I0130 06:48:35.751519 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Jan 30 06:48:35 crc kubenswrapper[4931]: I0130 06:48:35.755013 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Jan 30 06:48:35 crc kubenswrapper[4931]: I0130 06:48:35.798043 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.709086714 podStartE2EDuration="45.798022089s" podCreationTimestamp="2026-01-30 06:47:50 +0000 UTC" firstStartedPulling="2026-01-30 06:47:51.836447664 +0000 UTC m=+6007.206357921" lastFinishedPulling="2026-01-30 06:48:29.925382999 +0000 UTC m=+6045.295293296" observedRunningTime="2026-01-30 06:48:35.7866863 +0000 UTC m=+6051.156596587" watchObservedRunningTime="2026-01-30 06:48:35.798022089 +0000 UTC m=+6051.167932356"
Jan 30 06:48:37 crc kubenswrapper[4931]: E0130 06:48:37.574736 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="201626a3-bc04-48ab-859c-5a7ffe97670e"
Jan 30 06:48:37 crc kubenswrapper[4931]: I0130 06:48:37.771107 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"201626a3-bc04-48ab-859c-5a7ffe97670e","Type":"ContainerStarted","Data":"d2617c77f0fdebe0e80d581b32539d895e8c20afcf9a30ae8785370e94df527f"}
Jan 30 06:48:37 crc kubenswrapper[4931]: E0130 06:48:37.772821 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="201626a3-bc04-48ab-859c-5a7ffe97670e"
Jan 30 06:48:38 crc kubenswrapper[4931]: E0130 06:48:38.780411 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="201626a3-bc04-48ab-859c-5a7ffe97670e"
Jan 30 06:48:45 crc kubenswrapper[4931]: I0130 06:48:45.056879 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xg8js"]
Jan 30 06:48:45 crc kubenswrapper[4931]: I0130 06:48:45.069473 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xg8js"]
Jan 30 06:48:45 crc kubenswrapper[4931]: I0130 06:48:45.441092 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ed4dbf-5bb0-45b9-bc15-763a93ba7375" path="/var/lib/kubelet/pods/97ed4dbf-5bb0-45b9-bc15-763a93ba7375/volumes"
Jan 30 06:48:46 crc kubenswrapper[4931]: I0130 06:48:46.036709 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-49hcs"]
Jan 30 06:48:46 crc kubenswrapper[4931]: I0130 06:48:46.054594 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-49hcs"]
Jan 30 06:48:47 crc kubenswrapper[4931]: I0130 06:48:47.444488 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f484f87-1747-491b-a6c5-dd1d51ff66af" path="/var/lib/kubelet/pods/3f484f87-1747-491b-a6c5-dd1d51ff66af/volumes"
Jan 30 06:48:52 crc kubenswrapper[4931]: I0130 06:48:52.924700 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"201626a3-bc04-48ab-859c-5a7ffe97670e","Type":"ContainerStarted","Data":"8ec51fb652154b51616a8a659fc911eabe3266bab8b1482a87fa750159b29d69"}
Jan 30 06:48:52 crc kubenswrapper[4931]: I0130 06:48:52.949090 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.252430203 podStartE2EDuration="1m3.949071639s" podCreationTimestamp="2026-01-30 06:47:49 +0000 UTC" firstStartedPulling="2026-01-30 06:47:52.54963185 +0000 UTC m=+6007.919542117" lastFinishedPulling="2026-01-30 06:48:52.246273296 +0000 UTC m=+6067.616183553" observedRunningTime="2026-01-30 06:48:52.94627025 +0000 UTC m=+6068.316180507" watchObservedRunningTime="2026-01-30 06:48:52.949071639 +0000 UTC m=+6068.318981896"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.237141 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.240473 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.243663 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.244008 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.285303 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.353844 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-config-data\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.353947 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-scripts\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.353983 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.354086 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-log-httpd\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.354132 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-run-httpd\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.354159 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j95ck\" (UniqueName: \"kubernetes.io/projected/1ace174a-c316-432c-82da-840f5e2283d1-kube-api-access-j95ck\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.354175 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.455956 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.456448 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-log-httpd\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.456640 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-run-httpd\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.456687 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j95ck\" (UniqueName: \"kubernetes.io/projected/1ace174a-c316-432c-82da-840f5e2283d1-kube-api-access-j95ck\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.456727 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.456755 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-config-data\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.456851 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-scripts\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.457136 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-log-httpd\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.458229 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-run-httpd\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.463762 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-scripts\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.465075 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.466071 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.469578 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-config-data\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.483076 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j95ck\" (UniqueName: \"kubernetes.io/projected/1ace174a-c316-432c-82da-840f5e2283d1-kube-api-access-j95ck\") pod \"ceilometer-0\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.559909 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 06:48:56 crc kubenswrapper[4931]: I0130 06:48:56.993941 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:48:57 crc kubenswrapper[4931]: I0130 06:48:57.099369 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:48:57 crc kubenswrapper[4931]: W0130 06:48:57.100551 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ace174a_c316_432c_82da_840f5e2283d1.slice/crio-8f0ee7d141f630c7e73df9127c0fee9f75d55fe3a9ad54a729725d13272a81b3 WatchSource:0}: Error finding container 8f0ee7d141f630c7e73df9127c0fee9f75d55fe3a9ad54a729725d13272a81b3: Status 404 returned error can't find the container with id 8f0ee7d141f630c7e73df9127c0fee9f75d55fe3a9ad54a729725d13272a81b3
Jan 30 06:48:57 crc kubenswrapper[4931]: I0130 06:48:57.968303 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerStarted","Data":"aed3b001cce71a5012e1a07b41789b31d0ea72897e75f2bf7e3bbd6b03358b00"}
Jan 30 06:48:57 crc kubenswrapper[4931]: I0130 06:48:57.968955 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerStarted","Data":"8f0ee7d141f630c7e73df9127c0fee9f75d55fe3a9ad54a729725d13272a81b3"}
Jan 30 06:48:58 crc kubenswrapper[4931]: I0130 06:48:58.980745 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerStarted","Data":"67791474132ea95207d226ce291f362b43314fdb91b8a0365481b75a910cde7a"}
Jan 30 06:48:59 crc kubenswrapper[4931]: I0130 06:48:59.990631 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerStarted","Data":"dc00b4b18ec22a13fffbcb9088cbc8844fa86ef6d3929196ba622b907b1b6226"}
Jan 30 06:49:01 crc kubenswrapper[4931]: I0130 06:49:01.810834 4931 scope.go:117] "RemoveContainer" containerID="982580309a618618acf59d7ed62dffc9baa63654e107e79e17f31ae5e09b9d10"
Jan 30 06:49:01 crc kubenswrapper[4931]: I0130 06:49:01.848920 4931 scope.go:117] "RemoveContainer" containerID="0945601f9dc541b7489d17e996b29ddbb60ba07b0e0dec353dfa850db402078c"
Jan 30 06:49:01 crc kubenswrapper[4931]: I0130 06:49:01.918174 4931 scope.go:117] "RemoveContainer" containerID="6d585a871e0bcc02099ca7b0fec64c91cc44765f6bff9850b839ed74e64354fb"
Jan 30 06:49:01 crc kubenswrapper[4931]: I0130 06:49:01.958394 4931 scope.go:117] "RemoveContainer" containerID="f317a267d263377c23363a6996fcb39842543912fc8f152fbbaa9e502e107fac"
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.015775 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerStarted","Data":"12aa02774ebae7d466712369ad3821a4d5645f4c4076a555423c271558b4c479"}
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.015878 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.020091 4931 scope.go:117] "RemoveContainer" containerID="5ecad2bc0e017bf1777e069b0d51cce5fdf83ffb212486a161ec83e5ab28a776"
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.044980 4931 scope.go:117] "RemoveContainer" containerID="3e425706ce28b09c3d63a84e9658921cb423491b485b43961ce5f36589fbc46a"
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.064725 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.964262172 podStartE2EDuration="6.064702147s" podCreationTimestamp="2026-01-30 06:48:56 +0000 UTC" firstStartedPulling="2026-01-30 06:48:57.102917546 +0000 UTC m=+6072.472827803" lastFinishedPulling="2026-01-30 06:49:01.203357521 +0000 UTC m=+6076.573267778" observedRunningTime="2026-01-30 06:49:02.036323998 +0000 UTC m=+6077.406234255" watchObservedRunningTime="2026-01-30 06:49:02.064702147 +0000 UTC m=+6077.434612414"
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.083638 4931 scope.go:117] "RemoveContainer" containerID="e6ba02a25b1e0ee2f6ab68e84a5695ffab01e5de9788f62975d6f6c203f6437a"
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.103995 4931 scope.go:117] "RemoveContainer" containerID="4fbc33cf72a98eaa43152775b96af49bda5487d618cfc4a531333efaa8d2b27d"
Jan 30 06:49:02 crc kubenswrapper[4931]: I0130 06:49:02.147027 4931 scope.go:117] "RemoveContainer" containerID="27e13b6b085d0725e59ac8bc7078474da43dc2cdd83ba76b6820d4ffc9d8594f"
Jan 30 06:49:04 crc kubenswrapper[4931]: I0130 06:49:04.051580 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqzzz"]
Jan 30 06:49:04 crc kubenswrapper[4931]: I0130 06:49:04.068572 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqzzz"]
Jan 30 06:49:05 crc kubenswrapper[4931]: I0130 06:49:05.435327 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80061de-8d87-4c58-8733-26c5224bf03a" path="/var/lib/kubelet/pods/d80061de-8d87-4c58-8733-26c5224bf03a/volumes"
Jan 30 06:49:06 crc kubenswrapper[4931]: I0130 06:49:06.993183 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:49:06 crc kubenswrapper[4931]: I0130 06:49:06.997982 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:49:07 crc kubenswrapper[4931]: I0130 06:49:07.090473 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.383873 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-fmv6s"]
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.385608 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fmv6s"
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.398679 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-fmv6s"]
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.481326 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-cb1e-account-create-update-6drdw"]
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.482792 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-cb1e-account-create-update-6drdw"
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.485761 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.492951 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-cb1e-account-create-update-6drdw"]
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.515893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/681b527a-d511-4db8-8f19-1df02bbf9f61-operator-scripts\") pod \"aodh-db-create-fmv6s\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") " pod="openstack/aodh-db-create-fmv6s"
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.516098 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntfwg\" (UniqueName: \"kubernetes.io/projected/681b527a-d511-4db8-8f19-1df02bbf9f61-kube-api-access-ntfwg\") pod \"aodh-db-create-fmv6s\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") " pod="openstack/aodh-db-create-fmv6s"
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.617775 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e6957d-e715-4a84-9952-19f773cfe882-operator-scripts\") pod \"aodh-cb1e-account-create-update-6drdw\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " pod="openstack/aodh-cb1e-account-create-update-6drdw"
Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.617975 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/681b527a-d511-4db8-8f19-1df02bbf9f61-operator-scripts\") pod \"aodh-db-create-fmv6s\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") "
pod="openstack/aodh-db-create-fmv6s" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.618033 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjvsj\" (UniqueName: \"kubernetes.io/projected/51e6957d-e715-4a84-9952-19f773cfe882-kube-api-access-zjvsj\") pod \"aodh-cb1e-account-create-update-6drdw\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.618370 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntfwg\" (UniqueName: \"kubernetes.io/projected/681b527a-d511-4db8-8f19-1df02bbf9f61-kube-api-access-ntfwg\") pod \"aodh-db-create-fmv6s\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") " pod="openstack/aodh-db-create-fmv6s" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.618757 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/681b527a-d511-4db8-8f19-1df02bbf9f61-operator-scripts\") pod \"aodh-db-create-fmv6s\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") " pod="openstack/aodh-db-create-fmv6s" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.642298 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntfwg\" (UniqueName: \"kubernetes.io/projected/681b527a-d511-4db8-8f19-1df02bbf9f61-kube-api-access-ntfwg\") pod \"aodh-db-create-fmv6s\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") " pod="openstack/aodh-db-create-fmv6s" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.716454 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-fmv6s" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.719395 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e6957d-e715-4a84-9952-19f773cfe882-operator-scripts\") pod \"aodh-cb1e-account-create-update-6drdw\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.719542 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjvsj\" (UniqueName: \"kubernetes.io/projected/51e6957d-e715-4a84-9952-19f773cfe882-kube-api-access-zjvsj\") pod \"aodh-cb1e-account-create-update-6drdw\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.720045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e6957d-e715-4a84-9952-19f773cfe882-operator-scripts\") pod \"aodh-cb1e-account-create-update-6drdw\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.739911 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjvsj\" (UniqueName: \"kubernetes.io/projected/51e6957d-e715-4a84-9952-19f773cfe882-kube-api-access-zjvsj\") pod \"aodh-cb1e-account-create-update-6drdw\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:08 crc kubenswrapper[4931]: I0130 06:49:08.800621 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:09 crc kubenswrapper[4931]: I0130 06:49:09.312671 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-fmv6s"] Jan 30 06:49:09 crc kubenswrapper[4931]: W0130 06:49:09.318557 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod681b527a_d511_4db8_8f19_1df02bbf9f61.slice/crio-bfc55abe9e3678fe35af1979b421a6ca3ccdcbf00f9ed457fa2b8d11555bca9d WatchSource:0}: Error finding container bfc55abe9e3678fe35af1979b421a6ca3ccdcbf00f9ed457fa2b8d11555bca9d: Status 404 returned error can't find the container with id bfc55abe9e3678fe35af1979b421a6ca3ccdcbf00f9ed457fa2b8d11555bca9d Jan 30 06:49:09 crc kubenswrapper[4931]: I0130 06:49:09.581298 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-cb1e-account-create-update-6drdw"] Jan 30 06:49:10 crc kubenswrapper[4931]: I0130 06:49:10.136381 4931 generic.go:334] "Generic (PLEG): container finished" podID="681b527a-d511-4db8-8f19-1df02bbf9f61" containerID="80ab6efc7f6dcfb70eed703ea54962d42118f91ddd843c75b9238af6658827ba" exitCode=0 Jan 30 06:49:10 crc kubenswrapper[4931]: I0130 06:49:10.136501 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fmv6s" event={"ID":"681b527a-d511-4db8-8f19-1df02bbf9f61","Type":"ContainerDied","Data":"80ab6efc7f6dcfb70eed703ea54962d42118f91ddd843c75b9238af6658827ba"} Jan 30 06:49:10 crc kubenswrapper[4931]: I0130 06:49:10.136817 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fmv6s" event={"ID":"681b527a-d511-4db8-8f19-1df02bbf9f61","Type":"ContainerStarted","Data":"bfc55abe9e3678fe35af1979b421a6ca3ccdcbf00f9ed457fa2b8d11555bca9d"} Jan 30 06:49:10 crc kubenswrapper[4931]: I0130 06:49:10.139814 4931 generic.go:334] "Generic (PLEG): container finished" podID="51e6957d-e715-4a84-9952-19f773cfe882" 
containerID="0e4d3615364adb9fc327ac5ce20cdd4fecf281a043a844859ed5dca539ce5720" exitCode=0 Jan 30 06:49:10 crc kubenswrapper[4931]: I0130 06:49:10.139847 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-cb1e-account-create-update-6drdw" event={"ID":"51e6957d-e715-4a84-9952-19f773cfe882","Type":"ContainerDied","Data":"0e4d3615364adb9fc327ac5ce20cdd4fecf281a043a844859ed5dca539ce5720"} Jan 30 06:49:10 crc kubenswrapper[4931]: I0130 06:49:10.139866 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-cb1e-account-create-update-6drdw" event={"ID":"51e6957d-e715-4a84-9952-19f773cfe882","Type":"ContainerStarted","Data":"36b11eaacb0459b577609608abbf56aefe928116c6f3f0ce4cddffd24a1980f3"} Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.677312 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.710518 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-fmv6s" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.791769 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e6957d-e715-4a84-9952-19f773cfe882-operator-scripts\") pod \"51e6957d-e715-4a84-9952-19f773cfe882\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.792042 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjvsj\" (UniqueName: \"kubernetes.io/projected/51e6957d-e715-4a84-9952-19f773cfe882-kube-api-access-zjvsj\") pod \"51e6957d-e715-4a84-9952-19f773cfe882\" (UID: \"51e6957d-e715-4a84-9952-19f773cfe882\") " Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.792281 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e6957d-e715-4a84-9952-19f773cfe882-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51e6957d-e715-4a84-9952-19f773cfe882" (UID: "51e6957d-e715-4a84-9952-19f773cfe882"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.792726 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e6957d-e715-4a84-9952-19f773cfe882-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.797674 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e6957d-e715-4a84-9952-19f773cfe882-kube-api-access-zjvsj" (OuterVolumeSpecName: "kube-api-access-zjvsj") pod "51e6957d-e715-4a84-9952-19f773cfe882" (UID: "51e6957d-e715-4a84-9952-19f773cfe882"). InnerVolumeSpecName "kube-api-access-zjvsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.894175 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/681b527a-d511-4db8-8f19-1df02bbf9f61-operator-scripts\") pod \"681b527a-d511-4db8-8f19-1df02bbf9f61\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") " Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.894320 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntfwg\" (UniqueName: \"kubernetes.io/projected/681b527a-d511-4db8-8f19-1df02bbf9f61-kube-api-access-ntfwg\") pod \"681b527a-d511-4db8-8f19-1df02bbf9f61\" (UID: \"681b527a-d511-4db8-8f19-1df02bbf9f61\") " Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.894890 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjvsj\" (UniqueName: \"kubernetes.io/projected/51e6957d-e715-4a84-9952-19f773cfe882-kube-api-access-zjvsj\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.894951 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/681b527a-d511-4db8-8f19-1df02bbf9f61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "681b527a-d511-4db8-8f19-1df02bbf9f61" (UID: "681b527a-d511-4db8-8f19-1df02bbf9f61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.897621 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/681b527a-d511-4db8-8f19-1df02bbf9f61-kube-api-access-ntfwg" (OuterVolumeSpecName: "kube-api-access-ntfwg") pod "681b527a-d511-4db8-8f19-1df02bbf9f61" (UID: "681b527a-d511-4db8-8f19-1df02bbf9f61"). InnerVolumeSpecName "kube-api-access-ntfwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.997015 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntfwg\" (UniqueName: \"kubernetes.io/projected/681b527a-d511-4db8-8f19-1df02bbf9f61-kube-api-access-ntfwg\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:11.997371 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/681b527a-d511-4db8-8f19-1df02bbf9f61-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:12.167895 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fmv6s" event={"ID":"681b527a-d511-4db8-8f19-1df02bbf9f61","Type":"ContainerDied","Data":"bfc55abe9e3678fe35af1979b421a6ca3ccdcbf00f9ed457fa2b8d11555bca9d"} Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:12.167939 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fmv6s" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:12.167944 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc55abe9e3678fe35af1979b421a6ca3ccdcbf00f9ed457fa2b8d11555bca9d" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:12.170010 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-cb1e-account-create-update-6drdw" event={"ID":"51e6957d-e715-4a84-9952-19f773cfe882","Type":"ContainerDied","Data":"36b11eaacb0459b577609608abbf56aefe928116c6f3f0ce4cddffd24a1980f3"} Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:12.170049 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36b11eaacb0459b577609608abbf56aefe928116c6f3f0ce4cddffd24a1980f3" Jan 30 06:49:12 crc kubenswrapper[4931]: I0130 06:49:12.170066 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-cb1e-account-create-update-6drdw" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.883369 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-rq4fv"] Jan 30 06:49:13 crc kubenswrapper[4931]: E0130 06:49:13.884164 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="681b527a-d511-4db8-8f19-1df02bbf9f61" containerName="mariadb-database-create" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.884181 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="681b527a-d511-4db8-8f19-1df02bbf9f61" containerName="mariadb-database-create" Jan 30 06:49:13 crc kubenswrapper[4931]: E0130 06:49:13.884216 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e6957d-e715-4a84-9952-19f773cfe882" containerName="mariadb-account-create-update" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.884223 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e6957d-e715-4a84-9952-19f773cfe882" containerName="mariadb-account-create-update" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.884480 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="681b527a-d511-4db8-8f19-1df02bbf9f61" containerName="mariadb-database-create" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.884499 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e6957d-e715-4a84-9952-19f773cfe882" containerName="mariadb-account-create-update" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.885379 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.891244 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.893841 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xxpdc" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.893886 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.893952 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.900024 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rq4fv"] Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.964098 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-combined-ca-bundle\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.964151 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-scripts\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.964183 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwl2d\" (UniqueName: \"kubernetes.io/projected/76eec61d-6ff6-4286-9102-758374c6fa27-kube-api-access-qwl2d\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " 
pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:13 crc kubenswrapper[4931]: I0130 06:49:13.964308 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-config-data\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.065667 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-config-data\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.065740 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-combined-ca-bundle\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.065766 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-scripts\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.065794 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwl2d\" (UniqueName: \"kubernetes.io/projected/76eec61d-6ff6-4286-9102-758374c6fa27-kube-api-access-qwl2d\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.071166 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-scripts\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.071686 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-combined-ca-bundle\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.071719 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-config-data\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.092909 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwl2d\" (UniqueName: \"kubernetes.io/projected/76eec61d-6ff6-4286-9102-758374c6fa27-kube-api-access-qwl2d\") pod \"aodh-db-sync-rq4fv\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.212732 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:14 crc kubenswrapper[4931]: I0130 06:49:14.717018 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rq4fv"] Jan 30 06:49:14 crc kubenswrapper[4931]: W0130 06:49:14.738472 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76eec61d_6ff6_4286_9102_758374c6fa27.slice/crio-ce6b29ec009d92bf4beaaf1185150d3f0a5fc6375f5de7e6f45a3df21d88ab7f WatchSource:0}: Error finding container ce6b29ec009d92bf4beaaf1185150d3f0a5fc6375f5de7e6f45a3df21d88ab7f: Status 404 returned error can't find the container with id ce6b29ec009d92bf4beaaf1185150d3f0a5fc6375f5de7e6f45a3df21d88ab7f Jan 30 06:49:15 crc kubenswrapper[4931]: I0130 06:49:15.205471 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rq4fv" event={"ID":"76eec61d-6ff6-4286-9102-758374c6fa27","Type":"ContainerStarted","Data":"ce6b29ec009d92bf4beaaf1185150d3f0a5fc6375f5de7e6f45a3df21d88ab7f"} Jan 30 06:49:20 crc kubenswrapper[4931]: I0130 06:49:20.266758 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rq4fv" event={"ID":"76eec61d-6ff6-4286-9102-758374c6fa27","Type":"ContainerStarted","Data":"113fabb6410c6ac50615d981ed7da97e3148c88e8fc0cf34f88de6f851a2a62e"} Jan 30 06:49:20 crc kubenswrapper[4931]: I0130 06:49:20.286375 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-rq4fv" podStartSLOduration=2.059646285 podStartE2EDuration="7.286352003s" podCreationTimestamp="2026-01-30 06:49:13 +0000 UTC" firstStartedPulling="2026-01-30 06:49:14.74152729 +0000 UTC m=+6090.111437547" lastFinishedPulling="2026-01-30 06:49:19.968232968 +0000 UTC m=+6095.338143265" observedRunningTime="2026-01-30 06:49:20.280613621 +0000 UTC m=+6095.650523888" watchObservedRunningTime="2026-01-30 06:49:20.286352003 +0000 UTC m=+6095.656262270" Jan 30 06:49:22 crc 
kubenswrapper[4931]: I0130 06:49:22.301525 4931 generic.go:334] "Generic (PLEG): container finished" podID="76eec61d-6ff6-4286-9102-758374c6fa27" containerID="113fabb6410c6ac50615d981ed7da97e3148c88e8fc0cf34f88de6f851a2a62e" exitCode=0 Jan 30 06:49:22 crc kubenswrapper[4931]: I0130 06:49:22.301686 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rq4fv" event={"ID":"76eec61d-6ff6-4286-9102-758374c6fa27","Type":"ContainerDied","Data":"113fabb6410c6ac50615d981ed7da97e3148c88e8fc0cf34f88de6f851a2a62e"} Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.753853 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.806531 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-config-data\") pod \"76eec61d-6ff6-4286-9102-758374c6fa27\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.806603 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-combined-ca-bundle\") pod \"76eec61d-6ff6-4286-9102-758374c6fa27\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.806746 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwl2d\" (UniqueName: \"kubernetes.io/projected/76eec61d-6ff6-4286-9102-758374c6fa27-kube-api-access-qwl2d\") pod \"76eec61d-6ff6-4286-9102-758374c6fa27\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.806844 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-scripts\") pod \"76eec61d-6ff6-4286-9102-758374c6fa27\" (UID: \"76eec61d-6ff6-4286-9102-758374c6fa27\") " Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.812701 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-scripts" (OuterVolumeSpecName: "scripts") pod "76eec61d-6ff6-4286-9102-758374c6fa27" (UID: "76eec61d-6ff6-4286-9102-758374c6fa27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.812898 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76eec61d-6ff6-4286-9102-758374c6fa27-kube-api-access-qwl2d" (OuterVolumeSpecName: "kube-api-access-qwl2d") pod "76eec61d-6ff6-4286-9102-758374c6fa27" (UID: "76eec61d-6ff6-4286-9102-758374c6fa27"). InnerVolumeSpecName "kube-api-access-qwl2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.835933 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-config-data" (OuterVolumeSpecName: "config-data") pod "76eec61d-6ff6-4286-9102-758374c6fa27" (UID: "76eec61d-6ff6-4286-9102-758374c6fa27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.842568 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76eec61d-6ff6-4286-9102-758374c6fa27" (UID: "76eec61d-6ff6-4286-9102-758374c6fa27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.909360 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.909397 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.909408 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwl2d\" (UniqueName: \"kubernetes.io/projected/76eec61d-6ff6-4286-9102-758374c6fa27-kube-api-access-qwl2d\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:23 crc kubenswrapper[4931]: I0130 06:49:23.909427 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76eec61d-6ff6-4286-9102-758374c6fa27-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:24 crc kubenswrapper[4931]: I0130 06:49:24.329189 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rq4fv" event={"ID":"76eec61d-6ff6-4286-9102-758374c6fa27","Type":"ContainerDied","Data":"ce6b29ec009d92bf4beaaf1185150d3f0a5fc6375f5de7e6f45a3df21d88ab7f"} Jan 30 06:49:24 crc kubenswrapper[4931]: I0130 06:49:24.329232 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce6b29ec009d92bf4beaaf1185150d3f0a5fc6375f5de7e6f45a3df21d88ab7f" Jan 30 06:49:24 crc kubenswrapper[4931]: I0130 06:49:24.329242 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rq4fv" Jan 30 06:49:26 crc kubenswrapper[4931]: I0130 06:49:26.577913 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 06:49:27 crc kubenswrapper[4931]: I0130 06:49:27.363044 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:49:27 crc kubenswrapper[4931]: I0130 06:49:27.363381 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.478289 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 30 06:49:28 crc kubenswrapper[4931]: E0130 06:49:28.479011 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76eec61d-6ff6-4286-9102-758374c6fa27" containerName="aodh-db-sync" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.479025 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="76eec61d-6ff6-4286-9102-758374c6fa27" containerName="aodh-db-sync" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.479212 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="76eec61d-6ff6-4286-9102-758374c6fa27" containerName="aodh-db-sync" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.487636 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.491272 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xxpdc" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.491404 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.491499 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.494070 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.605218 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc96p\" (UniqueName: \"kubernetes.io/projected/890734fc-018f-4d2e-bc3e-ef4399f477da-kube-api-access-nc96p\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.605468 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-scripts\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.605765 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-combined-ca-bundle\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.605872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-config-data\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.708911 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-combined-ca-bundle\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.710828 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-config-data\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.710871 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc96p\" (UniqueName: \"kubernetes.io/projected/890734fc-018f-4d2e-bc3e-ef4399f477da-kube-api-access-nc96p\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.710939 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-scripts\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.716045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-config-data\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.716324 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-combined-ca-bundle\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.720966 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890734fc-018f-4d2e-bc3e-ef4399f477da-scripts\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.736503 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc96p\" (UniqueName: \"kubernetes.io/projected/890734fc-018f-4d2e-bc3e-ef4399f477da-kube-api-access-nc96p\") pod \"aodh-0\" (UID: \"890734fc-018f-4d2e-bc3e-ef4399f477da\") " pod="openstack/aodh-0" Jan 30 06:49:28 crc kubenswrapper[4931]: I0130 06:49:28.814238 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 30 06:49:29 crc kubenswrapper[4931]: I0130 06:49:29.294897 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 30 06:49:29 crc kubenswrapper[4931]: I0130 06:49:29.303188 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:49:29 crc kubenswrapper[4931]: I0130 06:49:29.379176 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"890734fc-018f-4d2e-bc3e-ef4399f477da","Type":"ContainerStarted","Data":"287f5d40351da171b121ab371f559dd20696a906d00896d0c40736967109d5e3"} Jan 30 06:49:30 crc kubenswrapper[4931]: I0130 06:49:30.390274 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"890734fc-018f-4d2e-bc3e-ef4399f477da","Type":"ContainerStarted","Data":"25e3b66d4c9038129e9edcdc5fafec226a5cbf6fda303811185ce570a2d99c73"} Jan 30 06:49:30 crc kubenswrapper[4931]: I0130 06:49:30.645921 4931 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:49:30 crc kubenswrapper[4931]: I0130 06:49:30.646168 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-central-agent" containerID="cri-o://aed3b001cce71a5012e1a07b41789b31d0ea72897e75f2bf7e3bbd6b03358b00" gracePeriod=30 Jan 30 06:49:30 crc kubenswrapper[4931]: I0130 06:49:30.646318 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-notification-agent" containerID="cri-o://67791474132ea95207d226ce291f362b43314fdb91b8a0365481b75a910cde7a" gracePeriod=30 Jan 30 06:49:30 crc kubenswrapper[4931]: I0130 06:49:30.646320 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="proxy-httpd" containerID="cri-o://12aa02774ebae7d466712369ad3821a4d5645f4c4076a555423c271558b4c479" gracePeriod=30 Jan 30 06:49:30 crc kubenswrapper[4931]: I0130 06:49:30.646548 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="sg-core" containerID="cri-o://dc00b4b18ec22a13fffbcb9088cbc8844fa86ef6d3929196ba622b907b1b6226" gracePeriod=30 Jan 30 06:49:31 crc kubenswrapper[4931]: I0130 06:49:31.403625 4931 generic.go:334] "Generic (PLEG): container finished" podID="1ace174a-c316-432c-82da-840f5e2283d1" containerID="12aa02774ebae7d466712369ad3821a4d5645f4c4076a555423c271558b4c479" exitCode=0 Jan 30 06:49:31 crc kubenswrapper[4931]: I0130 06:49:31.403955 4931 generic.go:334] "Generic (PLEG): container finished" podID="1ace174a-c316-432c-82da-840f5e2283d1" containerID="dc00b4b18ec22a13fffbcb9088cbc8844fa86ef6d3929196ba622b907b1b6226" exitCode=2 Jan 30 
06:49:31 crc kubenswrapper[4931]: I0130 06:49:31.403980 4931 generic.go:334] "Generic (PLEG): container finished" podID="1ace174a-c316-432c-82da-840f5e2283d1" containerID="aed3b001cce71a5012e1a07b41789b31d0ea72897e75f2bf7e3bbd6b03358b00" exitCode=0 Jan 30 06:49:31 crc kubenswrapper[4931]: I0130 06:49:31.403710 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerDied","Data":"12aa02774ebae7d466712369ad3821a4d5645f4c4076a555423c271558b4c479"} Jan 30 06:49:31 crc kubenswrapper[4931]: I0130 06:49:31.404021 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerDied","Data":"dc00b4b18ec22a13fffbcb9088cbc8844fa86ef6d3929196ba622b907b1b6226"} Jan 30 06:49:31 crc kubenswrapper[4931]: I0130 06:49:31.404040 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerDied","Data":"aed3b001cce71a5012e1a07b41789b31d0ea72897e75f2bf7e3bbd6b03358b00"} Jan 30 06:49:32 crc kubenswrapper[4931]: I0130 06:49:32.417635 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"890734fc-018f-4d2e-bc3e-ef4399f477da","Type":"ContainerStarted","Data":"9664b2dcf5cec3eb886c37c94885b290b4800534da7ca6b602895c3587b1d543"} Jan 30 06:49:34 crc kubenswrapper[4931]: I0130 06:49:34.463887 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"890734fc-018f-4d2e-bc3e-ef4399f477da","Type":"ContainerStarted","Data":"ee2123c23f00ddba11257e42d25ee445a7354f1e8f5862e132b2412004b6f861"} Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.484773 4931 generic.go:334] "Generic (PLEG): container finished" podID="1ace174a-c316-432c-82da-840f5e2283d1" containerID="67791474132ea95207d226ce291f362b43314fdb91b8a0365481b75a910cde7a" exitCode=0 Jan 30 06:49:35 crc 
kubenswrapper[4931]: I0130 06:49:35.485275 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerDied","Data":"67791474132ea95207d226ce291f362b43314fdb91b8a0365481b75a910cde7a"} Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.509907 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"890734fc-018f-4d2e-bc3e-ef4399f477da","Type":"ContainerStarted","Data":"57a999a5726ef486c30936fca19673a3ee9e7ca668e958733eb7c4e34f5d925d"} Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.536474 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.638814287 podStartE2EDuration="7.536449121s" podCreationTimestamp="2026-01-30 06:49:28 +0000 UTC" firstStartedPulling="2026-01-30 06:49:29.30284615 +0000 UTC m=+6104.672756417" lastFinishedPulling="2026-01-30 06:49:35.200480994 +0000 UTC m=+6110.570391251" observedRunningTime="2026-01-30 06:49:35.527607433 +0000 UTC m=+6110.897517710" watchObservedRunningTime="2026-01-30 06:49:35.536449121 +0000 UTC m=+6110.906359378" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.600031 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.668166 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-scripts\") pod \"1ace174a-c316-432c-82da-840f5e2283d1\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.668209 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-run-httpd\") pod \"1ace174a-c316-432c-82da-840f5e2283d1\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.668253 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-config-data\") pod \"1ace174a-c316-432c-82da-840f5e2283d1\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.668441 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-sg-core-conf-yaml\") pod \"1ace174a-c316-432c-82da-840f5e2283d1\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.668466 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-combined-ca-bundle\") pod \"1ace174a-c316-432c-82da-840f5e2283d1\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.668500 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-log-httpd\") pod \"1ace174a-c316-432c-82da-840f5e2283d1\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.668518 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j95ck\" (UniqueName: \"kubernetes.io/projected/1ace174a-c316-432c-82da-840f5e2283d1-kube-api-access-j95ck\") pod \"1ace174a-c316-432c-82da-840f5e2283d1\" (UID: \"1ace174a-c316-432c-82da-840f5e2283d1\") " Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.669742 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ace174a-c316-432c-82da-840f5e2283d1" (UID: "1ace174a-c316-432c-82da-840f5e2283d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.670076 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ace174a-c316-432c-82da-840f5e2283d1" (UID: "1ace174a-c316-432c-82da-840f5e2283d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.675051 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-scripts" (OuterVolumeSpecName: "scripts") pod "1ace174a-c316-432c-82da-840f5e2283d1" (UID: "1ace174a-c316-432c-82da-840f5e2283d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.682822 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ace174a-c316-432c-82da-840f5e2283d1-kube-api-access-j95ck" (OuterVolumeSpecName: "kube-api-access-j95ck") pod "1ace174a-c316-432c-82da-840f5e2283d1" (UID: "1ace174a-c316-432c-82da-840f5e2283d1"). InnerVolumeSpecName "kube-api-access-j95ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.714667 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ace174a-c316-432c-82da-840f5e2283d1" (UID: "1ace174a-c316-432c-82da-840f5e2283d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.767647 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ace174a-c316-432c-82da-840f5e2283d1" (UID: "1ace174a-c316-432c-82da-840f5e2283d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.770725 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.770763 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.770775 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.770789 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.770800 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ace174a-c316-432c-82da-840f5e2283d1-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.770811 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j95ck\" (UniqueName: \"kubernetes.io/projected/1ace174a-c316-432c-82da-840f5e2283d1-kube-api-access-j95ck\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.787695 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-config-data" (OuterVolumeSpecName: "config-data") pod "1ace174a-c316-432c-82da-840f5e2283d1" (UID: "1ace174a-c316-432c-82da-840f5e2283d1"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:35 crc kubenswrapper[4931]: I0130 06:49:35.872696 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace174a-c316-432c-82da-840f5e2283d1-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.521087 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.523278 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ace174a-c316-432c-82da-840f5e2283d1","Type":"ContainerDied","Data":"8f0ee7d141f630c7e73df9127c0fee9f75d55fe3a9ad54a729725d13272a81b3"} Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.523323 4931 scope.go:117] "RemoveContainer" containerID="12aa02774ebae7d466712369ad3821a4d5645f4c4076a555423c271558b4c479" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.545316 4931 scope.go:117] "RemoveContainer" containerID="dc00b4b18ec22a13fffbcb9088cbc8844fa86ef6d3929196ba622b907b1b6226" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.571942 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.573599 4931 scope.go:117] "RemoveContainer" containerID="67791474132ea95207d226ce291f362b43314fdb91b8a0365481b75a910cde7a" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.595315 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.600058 4931 scope.go:117] "RemoveContainer" containerID="aed3b001cce71a5012e1a07b41789b31d0ea72897e75f2bf7e3bbd6b03358b00" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.605323 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:49:36 crc 
kubenswrapper[4931]: E0130 06:49:36.605855 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="proxy-httpd" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.605872 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="proxy-httpd" Jan 30 06:49:36 crc kubenswrapper[4931]: E0130 06:49:36.605902 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-central-agent" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.605908 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-central-agent" Jan 30 06:49:36 crc kubenswrapper[4931]: E0130 06:49:36.605925 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-notification-agent" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.605932 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-notification-agent" Jan 30 06:49:36 crc kubenswrapper[4931]: E0130 06:49:36.605943 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="sg-core" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.605950 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="sg-core" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.606135 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-central-agent" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.606164 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ace174a-c316-432c-82da-840f5e2283d1" 
containerName="proxy-httpd" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.606174 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="ceilometer-notification-agent" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.606184 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ace174a-c316-432c-82da-840f5e2283d1" containerName="sg-core" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.608097 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.613849 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.614736 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.631169 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.687890 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-run-httpd\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.687929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.687968 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-scripts\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.688008 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.688031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-config-data\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.688087 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-log-httpd\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.688108 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqxwx\" (UniqueName: \"kubernetes.io/projected/83416e39-1feb-47a7-9e5d-748122bed281-kube-api-access-pqxwx\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.789753 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.789822 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-config-data\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.789905 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-log-httpd\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.789929 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqxwx\" (UniqueName: \"kubernetes.io/projected/83416e39-1feb-47a7-9e5d-748122bed281-kube-api-access-pqxwx\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.790049 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-run-httpd\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.790071 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.790128 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-scripts\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.790751 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-log-httpd\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.790790 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-run-httpd\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.796043 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.796850 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-scripts\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.797750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-config-data\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.811793 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pqxwx\" (UniqueName: \"kubernetes.io/projected/83416e39-1feb-47a7-9e5d-748122bed281-kube-api-access-pqxwx\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.817619 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " pod="openstack/ceilometer-0" Jan 30 06:49:36 crc kubenswrapper[4931]: I0130 06:49:36.932728 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:49:37 crc kubenswrapper[4931]: I0130 06:49:37.438143 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ace174a-c316-432c-82da-840f5e2283d1" path="/var/lib/kubelet/pods/1ace174a-c316-432c-82da-840f5e2283d1/volumes" Jan 30 06:49:37 crc kubenswrapper[4931]: I0130 06:49:37.442481 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:49:37 crc kubenswrapper[4931]: I0130 06:49:37.530813 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerStarted","Data":"c52383b65f03eb04f4d7782d65ea62f04f107a518cccdbc33d3de6808caeee13"} Jan 30 06:49:38 crc kubenswrapper[4931]: I0130 06:49:38.541016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerStarted","Data":"e6075bddf7d9e844cbcadfacf5366aedab5b6c0869d55610da59bca13b8f2e10"} Jan 30 06:49:39 crc kubenswrapper[4931]: I0130 06:49:39.552034 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerStarted","Data":"01bff7fca0dad20c19afe6ca1af5b694418bb46bbfaf5130445564c32e038144"} Jan 30 06:49:42 crc kubenswrapper[4931]: I0130 06:49:42.580325 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerStarted","Data":"d6ffd18a5fcda569a257ea582064d96c557e419380bbc5b6e6ffba7dcc0e2a22"} Jan 30 06:49:44 crc kubenswrapper[4931]: I0130 06:49:44.625265 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerStarted","Data":"c0fc86cc25abacab572b554b477bbaff87a06a333e44043d43cc5be409f8a89e"} Jan 30 06:49:44 crc kubenswrapper[4931]: I0130 06:49:44.625868 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 06:49:44 crc kubenswrapper[4931]: I0130 06:49:44.656038 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.995757448 podStartE2EDuration="8.65602008s" podCreationTimestamp="2026-01-30 06:49:36 +0000 UTC" firstStartedPulling="2026-01-30 06:49:37.45193214 +0000 UTC m=+6112.821842397" lastFinishedPulling="2026-01-30 06:49:44.112194772 +0000 UTC m=+6119.482105029" observedRunningTime="2026-01-30 06:49:44.649940389 +0000 UTC m=+6120.019850646" watchObservedRunningTime="2026-01-30 06:49:44.65602008 +0000 UTC m=+6120.025930337" Jan 30 06:49:44 crc kubenswrapper[4931]: I0130 06:49:44.992963 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-45ct9"] Jan 30 06:49:44 crc kubenswrapper[4931]: I0130 06:49:44.995948 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.008486 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-45ct9"] Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.097224 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-65c4-account-create-update-rfndg"] Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.099068 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.101238 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.107330 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-65c4-account-create-update-rfndg"] Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.110625 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2w5m\" (UniqueName: \"kubernetes.io/projected/448719bb-ff8e-4d9e-982b-a8425f907a15-kube-api-access-l2w5m\") pod \"manila-db-create-45ct9\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.110754 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/448719bb-ff8e-4d9e-982b-a8425f907a15-operator-scripts\") pod \"manila-db-create-45ct9\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.212721 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2w5m\" (UniqueName: 
\"kubernetes.io/projected/448719bb-ff8e-4d9e-982b-a8425f907a15-kube-api-access-l2w5m\") pod \"manila-db-create-45ct9\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.212817 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/448719bb-ff8e-4d9e-982b-a8425f907a15-operator-scripts\") pod \"manila-db-create-45ct9\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.212929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/037161b5-dad9-4d8f-9be4-f980ee947129-operator-scripts\") pod \"manila-65c4-account-create-update-rfndg\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.212958 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvwc\" (UniqueName: \"kubernetes.io/projected/037161b5-dad9-4d8f-9be4-f980ee947129-kube-api-access-6gvwc\") pod \"manila-65c4-account-create-update-rfndg\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.213810 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/448719bb-ff8e-4d9e-982b-a8425f907a15-operator-scripts\") pod \"manila-db-create-45ct9\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.232014 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l2w5m\" (UniqueName: \"kubernetes.io/projected/448719bb-ff8e-4d9e-982b-a8425f907a15-kube-api-access-l2w5m\") pod \"manila-db-create-45ct9\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.314374 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/037161b5-dad9-4d8f-9be4-f980ee947129-operator-scripts\") pod \"manila-65c4-account-create-update-rfndg\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.314538 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gvwc\" (UniqueName: \"kubernetes.io/projected/037161b5-dad9-4d8f-9be4-f980ee947129-kube-api-access-6gvwc\") pod \"manila-65c4-account-create-update-rfndg\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.315574 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/037161b5-dad9-4d8f-9be4-f980ee947129-operator-scripts\") pod \"manila-65c4-account-create-update-rfndg\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.334689 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-45ct9" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.353254 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvwc\" (UniqueName: \"kubernetes.io/projected/037161b5-dad9-4d8f-9be4-f980ee947129-kube-api-access-6gvwc\") pod \"manila-65c4-account-create-update-rfndg\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.420387 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:45 crc kubenswrapper[4931]: I0130 06:49:45.814910 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-45ct9"] Jan 30 06:49:46 crc kubenswrapper[4931]: W0130 06:49:46.180398 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod037161b5_dad9_4d8f_9be4_f980ee947129.slice/crio-88f853b1d6e7fa7789617a86c9ca6cda2ea7a5e89d27946d5df194f470ea3b57 WatchSource:0}: Error finding container 88f853b1d6e7fa7789617a86c9ca6cda2ea7a5e89d27946d5df194f470ea3b57: Status 404 returned error can't find the container with id 88f853b1d6e7fa7789617a86c9ca6cda2ea7a5e89d27946d5df194f470ea3b57 Jan 30 06:49:46 crc kubenswrapper[4931]: I0130 06:49:46.185647 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-65c4-account-create-update-rfndg"] Jan 30 06:49:46 crc kubenswrapper[4931]: I0130 06:49:46.185789 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Jan 30 06:49:46 crc kubenswrapper[4931]: I0130 06:49:46.675477 4931 generic.go:334] "Generic (PLEG): container finished" podID="037161b5-dad9-4d8f-9be4-f980ee947129" containerID="7c8becae24c7a8a33bf584e1ab34512a30cd0f1208b8f42cb257da9c6245e6c8" exitCode=0 Jan 30 06:49:46 crc 
kubenswrapper[4931]: I0130 06:49:46.675688 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-65c4-account-create-update-rfndg" event={"ID":"037161b5-dad9-4d8f-9be4-f980ee947129","Type":"ContainerDied","Data":"7c8becae24c7a8a33bf584e1ab34512a30cd0f1208b8f42cb257da9c6245e6c8"} Jan 30 06:49:46 crc kubenswrapper[4931]: I0130 06:49:46.675820 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-65c4-account-create-update-rfndg" event={"ID":"037161b5-dad9-4d8f-9be4-f980ee947129","Type":"ContainerStarted","Data":"88f853b1d6e7fa7789617a86c9ca6cda2ea7a5e89d27946d5df194f470ea3b57"} Jan 30 06:49:46 crc kubenswrapper[4931]: I0130 06:49:46.678715 4931 generic.go:334] "Generic (PLEG): container finished" podID="448719bb-ff8e-4d9e-982b-a8425f907a15" containerID="030be6de81f263d984b02a8d10e7722844ea7978d675c59a14a66ccbbd2666b2" exitCode=0 Jan 30 06:49:46 crc kubenswrapper[4931]: I0130 06:49:46.678750 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-45ct9" event={"ID":"448719bb-ff8e-4d9e-982b-a8425f907a15","Type":"ContainerDied","Data":"030be6de81f263d984b02a8d10e7722844ea7978d675c59a14a66ccbbd2666b2"} Jan 30 06:49:46 crc kubenswrapper[4931]: I0130 06:49:46.678769 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-45ct9" event={"ID":"448719bb-ff8e-4d9e-982b-a8425f907a15","Type":"ContainerStarted","Data":"e434123ad5b1fb377d0c8862f9eb8db04393395119cd0b0bbe793eaf79cbbfc2"} Jan 30 06:49:47 crc kubenswrapper[4931]: I0130 06:49:47.042986 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f264-account-create-update-hm2jw"] Jan 30 06:49:47 crc kubenswrapper[4931]: I0130 06:49:47.052223 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7fmgw"] Jan 30 06:49:47 crc kubenswrapper[4931]: I0130 06:49:47.062666 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7fmgw"] Jan 30 
06:49:47 crc kubenswrapper[4931]: I0130 06:49:47.070690 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f264-account-create-update-hm2jw"] Jan 30 06:49:47 crc kubenswrapper[4931]: I0130 06:49:47.438697 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bda186-cc7a-4422-8266-5f494795cf7f" path="/var/lib/kubelet/pods/44bda186-cc7a-4422-8266-5f494795cf7f/volumes" Jan 30 06:49:47 crc kubenswrapper[4931]: I0130 06:49:47.439427 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7909a37-a194-4666-b642-8193c2b8e29c" path="/var/lib/kubelet/pods/a7909a37-a194-4666-b642-8193c2b8e29c/volumes" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.207421 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.215921 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-45ct9" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.283535 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gvwc\" (UniqueName: \"kubernetes.io/projected/037161b5-dad9-4d8f-9be4-f980ee947129-kube-api-access-6gvwc\") pod \"037161b5-dad9-4d8f-9be4-f980ee947129\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.283940 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/037161b5-dad9-4d8f-9be4-f980ee947129-operator-scripts\") pod \"037161b5-dad9-4d8f-9be4-f980ee947129\" (UID: \"037161b5-dad9-4d8f-9be4-f980ee947129\") " Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.284781 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037161b5-dad9-4d8f-9be4-f980ee947129-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "037161b5-dad9-4d8f-9be4-f980ee947129" (UID: "037161b5-dad9-4d8f-9be4-f980ee947129"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.292426 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037161b5-dad9-4d8f-9be4-f980ee947129-kube-api-access-6gvwc" (OuterVolumeSpecName: "kube-api-access-6gvwc") pod "037161b5-dad9-4d8f-9be4-f980ee947129" (UID: "037161b5-dad9-4d8f-9be4-f980ee947129"). InnerVolumeSpecName "kube-api-access-6gvwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.387080 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/448719bb-ff8e-4d9e-982b-a8425f907a15-operator-scripts\") pod \"448719bb-ff8e-4d9e-982b-a8425f907a15\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.387894 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2w5m\" (UniqueName: \"kubernetes.io/projected/448719bb-ff8e-4d9e-982b-a8425f907a15-kube-api-access-l2w5m\") pod \"448719bb-ff8e-4d9e-982b-a8425f907a15\" (UID: \"448719bb-ff8e-4d9e-982b-a8425f907a15\") " Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.388849 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/037161b5-dad9-4d8f-9be4-f980ee947129-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.388868 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gvwc\" (UniqueName: \"kubernetes.io/projected/037161b5-dad9-4d8f-9be4-f980ee947129-kube-api-access-6gvwc\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:48 crc 
kubenswrapper[4931]: I0130 06:49:48.389300 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/448719bb-ff8e-4d9e-982b-a8425f907a15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "448719bb-ff8e-4d9e-982b-a8425f907a15" (UID: "448719bb-ff8e-4d9e-982b-a8425f907a15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.403177 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/448719bb-ff8e-4d9e-982b-a8425f907a15-kube-api-access-l2w5m" (OuterVolumeSpecName: "kube-api-access-l2w5m") pod "448719bb-ff8e-4d9e-982b-a8425f907a15" (UID: "448719bb-ff8e-4d9e-982b-a8425f907a15"). InnerVolumeSpecName "kube-api-access-l2w5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.491036 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2w5m\" (UniqueName: \"kubernetes.io/projected/448719bb-ff8e-4d9e-982b-a8425f907a15-kube-api-access-l2w5m\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.491075 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/448719bb-ff8e-4d9e-982b-a8425f907a15-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.698399 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-65c4-account-create-update-rfndg" event={"ID":"037161b5-dad9-4d8f-9be4-f980ee947129","Type":"ContainerDied","Data":"88f853b1d6e7fa7789617a86c9ca6cda2ea7a5e89d27946d5df194f470ea3b57"} Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.698477 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f853b1d6e7fa7789617a86c9ca6cda2ea7a5e89d27946d5df194f470ea3b57" Jan 30 
06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.698610 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-65c4-account-create-update-rfndg" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.708231 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-45ct9" event={"ID":"448719bb-ff8e-4d9e-982b-a8425f907a15","Type":"ContainerDied","Data":"e434123ad5b1fb377d0c8862f9eb8db04393395119cd0b0bbe793eaf79cbbfc2"} Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.708272 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e434123ad5b1fb377d0c8862f9eb8db04393395119cd0b0bbe793eaf79cbbfc2" Jan 30 06:49:48 crc kubenswrapper[4931]: I0130 06:49:48.708343 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-45ct9" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.687132 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-ctjj7"] Jan 30 06:49:50 crc kubenswrapper[4931]: E0130 06:49:50.687942 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="037161b5-dad9-4d8f-9be4-f980ee947129" containerName="mariadb-account-create-update" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.687960 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="037161b5-dad9-4d8f-9be4-f980ee947129" containerName="mariadb-account-create-update" Jan 30 06:49:50 crc kubenswrapper[4931]: E0130 06:49:50.687975 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="448719bb-ff8e-4d9e-982b-a8425f907a15" containerName="mariadb-database-create" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.687982 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="448719bb-ff8e-4d9e-982b-a8425f907a15" containerName="mariadb-database-create" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.688240 4931 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="037161b5-dad9-4d8f-9be4-f980ee947129" containerName="mariadb-account-create-update" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.688267 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="448719bb-ff8e-4d9e-982b-a8425f907a15" containerName="mariadb-database-create" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.689372 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.693772 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-h2dqk" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.706371 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-ctjj7"] Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.711849 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.842144 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-config-data\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.842637 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-combined-ca-bundle\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.842887 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: 
\"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-job-config-data\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.843010 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/2f518288-3c69-4f3a-9e32-9f9211cab22a-kube-api-access-gktd8\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.944918 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-config-data\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.945014 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-combined-ca-bundle\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.945061 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-job-config-data\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.945087 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/2f518288-3c69-4f3a-9e32-9f9211cab22a-kube-api-access-gktd8\") pod 
\"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.960198 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-combined-ca-bundle\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.960988 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-job-config-data\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.961045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-config-data\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:50 crc kubenswrapper[4931]: I0130 06:49:50.973315 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/2f518288-3c69-4f3a-9e32-9f9211cab22a-kube-api-access-gktd8\") pod \"manila-db-sync-ctjj7\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:51 crc kubenswrapper[4931]: I0130 06:49:51.010858 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-ctjj7" Jan 30 06:49:52 crc kubenswrapper[4931]: W0130 06:49:52.106348 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f518288_3c69_4f3a_9e32_9f9211cab22a.slice/crio-166e1550f8f34b4565cb40cded79456bbdb37c4ffea08a4b6b17690ebabe67e1 WatchSource:0}: Error finding container 166e1550f8f34b4565cb40cded79456bbdb37c4ffea08a4b6b17690ebabe67e1: Status 404 returned error can't find the container with id 166e1550f8f34b4565cb40cded79456bbdb37c4ffea08a4b6b17690ebabe67e1 Jan 30 06:49:52 crc kubenswrapper[4931]: I0130 06:49:52.109179 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-ctjj7"] Jan 30 06:49:52 crc kubenswrapper[4931]: I0130 06:49:52.751351 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ctjj7" event={"ID":"2f518288-3c69-4f3a-9e32-9f9211cab22a","Type":"ContainerStarted","Data":"166e1550f8f34b4565cb40cded79456bbdb37c4ffea08a4b6b17690ebabe67e1"} Jan 30 06:49:55 crc kubenswrapper[4931]: I0130 06:49:55.058688 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tfpsx"] Jan 30 06:49:55 crc kubenswrapper[4931]: I0130 06:49:55.074398 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tfpsx"] Jan 30 06:49:55 crc kubenswrapper[4931]: I0130 06:49:55.436353 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0677a1-9051-4719-9e4e-142694e6683a" path="/var/lib/kubelet/pods/de0677a1-9051-4719-9e4e-142694e6683a/volumes" Jan 30 06:49:56 crc kubenswrapper[4931]: I0130 06:49:56.897332 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6wq2"] Jan 30 06:49:56 crc kubenswrapper[4931]: I0130 06:49:56.935469 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:56 crc kubenswrapper[4931]: I0130 06:49:56.980288 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6wq2"] Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.077123 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngkp9\" (UniqueName: \"kubernetes.io/projected/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-kube-api-access-ngkp9\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.077232 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-catalog-content\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.077502 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-utilities\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.179265 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngkp9\" (UniqueName: \"kubernetes.io/projected/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-kube-api-access-ngkp9\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.179470 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-catalog-content\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.179693 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-utilities\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.179907 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-catalog-content\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.180122 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-utilities\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.203708 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngkp9\" (UniqueName: \"kubernetes.io/projected/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-kube-api-access-ngkp9\") pod \"community-operators-j6wq2\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.271620 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.363440 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:49:57 crc kubenswrapper[4931]: I0130 06:49:57.363499 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:50:01 crc kubenswrapper[4931]: I0130 06:50:01.752508 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6wq2"] Jan 30 06:50:01 crc kubenswrapper[4931]: W0130 06:50:01.758470 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b12adab_30ea_4122_9ea7_0c8b2fdb117c.slice/crio-a18b31c9c3fa680e8d93ca8737e2041057cbd5a166876584d210ca053654abaf WatchSource:0}: Error finding container a18b31c9c3fa680e8d93ca8737e2041057cbd5a166876584d210ca053654abaf: Status 404 returned error can't find the container with id a18b31c9c3fa680e8d93ca8737e2041057cbd5a166876584d210ca053654abaf Jan 30 06:50:01 crc kubenswrapper[4931]: I0130 06:50:01.837201 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6wq2" event={"ID":"2b12adab-30ea-4122-9ea7-0c8b2fdb117c","Type":"ContainerStarted","Data":"a18b31c9c3fa680e8d93ca8737e2041057cbd5a166876584d210ca053654abaf"} Jan 30 06:50:02 crc kubenswrapper[4931]: I0130 06:50:02.337012 4931 scope.go:117] "RemoveContainer" 
containerID="956ce554bd663761599c9dc4f978e7719f40043720c3d10db30cc18c76ff6127" Jan 30 06:50:02 crc kubenswrapper[4931]: I0130 06:50:02.412742 4931 scope.go:117] "RemoveContainer" containerID="cff7e0d64b5667667e85a8a7d8d6d557567a72e224933981bb30fb75cc9c37a5" Jan 30 06:50:02 crc kubenswrapper[4931]: I0130 06:50:02.505038 4931 scope.go:117] "RemoveContainer" containerID="18638ce6c93ee0d191ba3ee6b587a88fb1ae5413bad9f52cba0bc5cd608d3a29" Jan 30 06:50:02 crc kubenswrapper[4931]: I0130 06:50:02.571728 4931 scope.go:117] "RemoveContainer" containerID="3f61c0f02fd2c2ced024d1e703805185ca2f7ee2e42863e0e8a06a4f812766d2" Jan 30 06:50:02 crc kubenswrapper[4931]: I0130 06:50:02.605772 4931 scope.go:117] "RemoveContainer" containerID="ad91a28e445938a8582000a48ddcd232576020a5c15a4a29af6e45aaf8531507" Jan 30 06:50:02 crc kubenswrapper[4931]: I0130 06:50:02.849524 4931 generic.go:334] "Generic (PLEG): container finished" podID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerID="b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d" exitCode=0 Jan 30 06:50:02 crc kubenswrapper[4931]: I0130 06:50:02.849566 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6wq2" event={"ID":"2b12adab-30ea-4122-9ea7-0c8b2fdb117c","Type":"ContainerDied","Data":"b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d"} Jan 30 06:50:03 crc kubenswrapper[4931]: I0130 06:50:03.862719 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ctjj7" event={"ID":"2f518288-3c69-4f3a-9e32-9f9211cab22a","Type":"ContainerStarted","Data":"e300d33068406baea942af2b5b021d10a35ce639099354dd534b82d9b9278f4c"} Jan 30 06:50:03 crc kubenswrapper[4931]: I0130 06:50:03.885101 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-ctjj7" podStartSLOduration=3.777201205 podStartE2EDuration="13.885075983s" podCreationTimestamp="2026-01-30 06:49:50 +0000 UTC" 
firstStartedPulling="2026-01-30 06:49:52.113507723 +0000 UTC m=+6127.483418000" lastFinishedPulling="2026-01-30 06:50:02.221382481 +0000 UTC m=+6137.591292778" observedRunningTime="2026-01-30 06:50:03.876922184 +0000 UTC m=+6139.246832441" watchObservedRunningTime="2026-01-30 06:50:03.885075983 +0000 UTC m=+6139.254986250" Jan 30 06:50:05 crc kubenswrapper[4931]: I0130 06:50:05.885346 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6wq2" event={"ID":"2b12adab-30ea-4122-9ea7-0c8b2fdb117c","Type":"ContainerStarted","Data":"681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4"} Jan 30 06:50:06 crc kubenswrapper[4931]: I0130 06:50:06.980900 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 06:50:07 crc kubenswrapper[4931]: I0130 06:50:07.908639 4931 generic.go:334] "Generic (PLEG): container finished" podID="2f518288-3c69-4f3a-9e32-9f9211cab22a" containerID="e300d33068406baea942af2b5b021d10a35ce639099354dd534b82d9b9278f4c" exitCode=0 Jan 30 06:50:07 crc kubenswrapper[4931]: I0130 06:50:07.908706 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ctjj7" event={"ID":"2f518288-3c69-4f3a-9e32-9f9211cab22a","Type":"ContainerDied","Data":"e300d33068406baea942af2b5b021d10a35ce639099354dd534b82d9b9278f4c"} Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.479116 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-ctjj7" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.595782 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-config-data\") pod \"2f518288-3c69-4f3a-9e32-9f9211cab22a\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.595853 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-combined-ca-bundle\") pod \"2f518288-3c69-4f3a-9e32-9f9211cab22a\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.595917 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/2f518288-3c69-4f3a-9e32-9f9211cab22a-kube-api-access-gktd8\") pod \"2f518288-3c69-4f3a-9e32-9f9211cab22a\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.596068 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-job-config-data\") pod \"2f518288-3c69-4f3a-9e32-9f9211cab22a\" (UID: \"2f518288-3c69-4f3a-9e32-9f9211cab22a\") " Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.610631 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f518288-3c69-4f3a-9e32-9f9211cab22a-kube-api-access-gktd8" (OuterVolumeSpecName: "kube-api-access-gktd8") pod "2f518288-3c69-4f3a-9e32-9f9211cab22a" (UID: "2f518288-3c69-4f3a-9e32-9f9211cab22a"). InnerVolumeSpecName "kube-api-access-gktd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.613375 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-config-data" (OuterVolumeSpecName: "config-data") pod "2f518288-3c69-4f3a-9e32-9f9211cab22a" (UID: "2f518288-3c69-4f3a-9e32-9f9211cab22a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.618958 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "2f518288-3c69-4f3a-9e32-9f9211cab22a" (UID: "2f518288-3c69-4f3a-9e32-9f9211cab22a"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.635492 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f518288-3c69-4f3a-9e32-9f9211cab22a" (UID: "2f518288-3c69-4f3a-9e32-9f9211cab22a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.698137 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.698174 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/2f518288-3c69-4f3a-9e32-9f9211cab22a-kube-api-access-gktd8\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.698184 4931 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-job-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.698195 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f518288-3c69-4f3a-9e32-9f9211cab22a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.926849 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ctjj7" event={"ID":"2f518288-3c69-4f3a-9e32-9f9211cab22a","Type":"ContainerDied","Data":"166e1550f8f34b4565cb40cded79456bbdb37c4ffea08a4b6b17690ebabe67e1"} Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.927200 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="166e1550f8f34b4565cb40cded79456bbdb37c4ffea08a4b6b17690ebabe67e1" Jan 30 06:50:09 crc kubenswrapper[4931]: I0130 06:50:09.926885 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-ctjj7" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.587475 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jan 30 06:50:10 crc kubenswrapper[4931]: E0130 06:50:10.587911 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f518288-3c69-4f3a-9e32-9f9211cab22a" containerName="manila-db-sync" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.587923 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f518288-3c69-4f3a-9e32-9f9211cab22a" containerName="manila-db-sync" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.588137 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f518288-3c69-4f3a-9e32-9f9211cab22a" containerName="manila-db-sync" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.589242 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.598819 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59977785bf-q4vw9"] Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.599149 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.599295 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-h2dqk" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.599315 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.599489 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.600610 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.607359 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.609182 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.617479 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.617726 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.635369 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.688023 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59977785bf-q4vw9"] Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.720586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-dns-svc\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.720940 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.720969 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721066 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721157 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-config\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721204 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-config-data\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721241 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-scripts\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721266 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721289 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721310 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721340 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-scripts\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721368 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28zr\" (UniqueName: \"kubernetes.io/projected/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-kube-api-access-d28zr\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721402 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-config-data\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721449 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-ovsdbserver-sb\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721480 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6c7\" (UniqueName: \"kubernetes.io/projected/b05cd1de-6848-4de5-92f4-399913835db3-kube-api-access-6m6c7\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721522 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pstx2\" (UniqueName: \"kubernetes.io/projected/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-kube-api-access-pstx2\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721600 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-ceph\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721643 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-ovsdbserver-nb\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.721666 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.765960 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.767906 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.772207 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.781659 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.822885 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pstx2\" (UniqueName: \"kubernetes.io/projected/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-kube-api-access-pstx2\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.822955 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-config-data-custom\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" 
Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823013 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-etc-machine-id\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823033 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-ceph\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823065 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-ovsdbserver-nb\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823080 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823125 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-logs\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823143 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-dns-svc\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823165 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823182 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823208 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-config\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823273 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-config-data\") pod 
\"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823296 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-scripts\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823313 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823330 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823346 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823362 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc 
kubenswrapper[4931]: I0130 06:50:10.823379 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-scripts\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823396 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-scripts\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823416 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28zr\" (UniqueName: \"kubernetes.io/projected/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-kube-api-access-d28zr\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823446 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrzc\" (UniqueName: \"kubernetes.io/projected/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-kube-api-access-9qrzc\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823464 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-config-data\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823479 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-config-data\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823499 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-ovsdbserver-sb\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.823525 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6c7\" (UniqueName: \"kubernetes.io/projected/b05cd1de-6848-4de5-92f4-399913835db3-kube-api-access-6m6c7\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.825120 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-ovsdbserver-nb\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.826673 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-ovsdbserver-sb\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.827219 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-config\") pod 
\"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.827354 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b05cd1de-6848-4de5-92f4-399913835db3-dns-svc\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.827487 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.827527 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.828153 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.831067 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-scripts\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 
06:50:10.833145 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.834700 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-config-data\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.839660 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6c7\" (UniqueName: \"kubernetes.io/projected/b05cd1de-6848-4de5-92f4-399913835db3-kube-api-access-6m6c7\") pod \"dnsmasq-dns-59977785bf-q4vw9\" (UID: \"b05cd1de-6848-4de5-92f4-399913835db3\") " pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.840052 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.844127 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-ceph\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.844653 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-config-data\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.845888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-scripts\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.846360 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.847975 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.856607 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28zr\" (UniqueName: \"kubernetes.io/projected/3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70-kube-api-access-d28zr\") pod \"manila-share-share1-0\" (UID: \"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70\") " pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.875818 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pstx2\" (UniqueName: \"kubernetes.io/projected/b9eb637a-1c6e-47f5-87ec-fa28c244db0b-kube-api-access-pstx2\") pod \"manila-scheduler-0\" (UID: \"b9eb637a-1c6e-47f5-87ec-fa28c244db0b\") " 
pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.923453 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936016 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-logs\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936173 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936205 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-scripts\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936246 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qrzc\" (UniqueName: \"kubernetes.io/projected/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-kube-api-access-9qrzc\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936269 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-config-data\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc 
kubenswrapper[4931]: I0130 06:50:10.936317 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-config-data-custom\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936376 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-etc-machine-id\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936394 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-logs\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936583 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.936620 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-etc-machine-id\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.939575 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-scripts\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.939951 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.940268 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-config-data-custom\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.941827 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-config-data\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.945340 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 30 06:50:10 crc kubenswrapper[4931]: I0130 06:50:10.957394 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qrzc\" (UniqueName: \"kubernetes.io/projected/50c75c49-3fc8-4f3e-9af2-66535e3b49a9-kube-api-access-9qrzc\") pod \"manila-api-0\" (UID: \"50c75c49-3fc8-4f3e-9af2-66535e3b49a9\") " pod="openstack/manila-api-0" Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.092559 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.529287 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.591956 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59977785bf-q4vw9"] Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.781185 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.949718 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" event={"ID":"b05cd1de-6848-4de5-92f4-399913835db3","Type":"ContainerStarted","Data":"89d6e80860900a2a9ba9ceaeb6a787c29072ba5d410dfd6349fc20a33ed3e5a9"} Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.949768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" event={"ID":"b05cd1de-6848-4de5-92f4-399913835db3","Type":"ContainerStarted","Data":"3cd0c7d307be633a0d3b62ce55290c53bc46cd5e7f5cfbff1dd13256a3f46de7"} Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.952161 4931 generic.go:334] "Generic (PLEG): container finished" podID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerID="681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4" exitCode=0 Jan 30 06:50:11 crc kubenswrapper[4931]: 
I0130 06:50:11.952256 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6wq2" event={"ID":"2b12adab-30ea-4122-9ea7-0c8b2fdb117c","Type":"ContainerDied","Data":"681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4"} Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.953168 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b9eb637a-1c6e-47f5-87ec-fa28c244db0b","Type":"ContainerStarted","Data":"c197a3157a7e967f47a4986f2f525f9a378e4435446719644353b51b8309043a"} Jan 30 06:50:11 crc kubenswrapper[4931]: I0130 06:50:11.954794 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70","Type":"ContainerStarted","Data":"3c61c439f05978a8c0bd0c74057137c1fb75b1562304656a69e25d66548a55b8"} Jan 30 06:50:12 crc kubenswrapper[4931]: I0130 06:50:12.847454 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 30 06:50:12 crc kubenswrapper[4931]: I0130 06:50:12.969305 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"50c75c49-3fc8-4f3e-9af2-66535e3b49a9","Type":"ContainerStarted","Data":"9a5e78170ba9979b37144bb6c69619a96fe71ec83f7f7ae1c8aa04dd333aa696"} Jan 30 06:50:12 crc kubenswrapper[4931]: I0130 06:50:12.970796 4931 generic.go:334] "Generic (PLEG): container finished" podID="b05cd1de-6848-4de5-92f4-399913835db3" containerID="89d6e80860900a2a9ba9ceaeb6a787c29072ba5d410dfd6349fc20a33ed3e5a9" exitCode=0 Jan 30 06:50:12 crc kubenswrapper[4931]: I0130 06:50:12.970842 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" event={"ID":"b05cd1de-6848-4de5-92f4-399913835db3","Type":"ContainerDied","Data":"89d6e80860900a2a9ba9ceaeb6a787c29072ba5d410dfd6349fc20a33ed3e5a9"} Jan 30 06:50:12 crc kubenswrapper[4931]: I0130 06:50:12.980232 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6wq2" event={"ID":"2b12adab-30ea-4122-9ea7-0c8b2fdb117c","Type":"ContainerStarted","Data":"137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1"} Jan 30 06:50:13 crc kubenswrapper[4931]: I0130 06:50:13.056066 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6wq2" podStartSLOduration=7.405172794 podStartE2EDuration="17.056044989s" podCreationTimestamp="2026-01-30 06:49:56 +0000 UTC" firstStartedPulling="2026-01-30 06:50:02.851409496 +0000 UTC m=+6138.221319763" lastFinishedPulling="2026-01-30 06:50:12.502281701 +0000 UTC m=+6147.872191958" observedRunningTime="2026-01-30 06:50:13.016666541 +0000 UTC m=+6148.386576808" watchObservedRunningTime="2026-01-30 06:50:13.056044989 +0000 UTC m=+6148.425955246" Jan 30 06:50:14 crc kubenswrapper[4931]: I0130 06:50:14.002601 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b9eb637a-1c6e-47f5-87ec-fa28c244db0b","Type":"ContainerStarted","Data":"fff3cd3da7a1e0565d6c3ab09c75b7c979ce4896ad15ee33669392f300594aab"} Jan 30 06:50:14 crc kubenswrapper[4931]: I0130 06:50:14.003182 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b9eb637a-1c6e-47f5-87ec-fa28c244db0b","Type":"ContainerStarted","Data":"bf40182a00506ac472df5b8e5486b83c229d757049edd269d9747aa93052e605"} Jan 30 06:50:14 crc kubenswrapper[4931]: I0130 06:50:14.007078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"50c75c49-3fc8-4f3e-9af2-66535e3b49a9","Type":"ContainerStarted","Data":"3a0fdc2feb00fcee5361f31e8874e980b3b4cc3670d04aa01f305230ca363abb"} Jan 30 06:50:14 crc kubenswrapper[4931]: I0130 06:50:14.014016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" 
event={"ID":"b05cd1de-6848-4de5-92f4-399913835db3","Type":"ContainerStarted","Data":"d0320521dc403d998ca155278a7f3957b38bcbd67acc9239576b5a083b9a2b3c"} Jan 30 06:50:14 crc kubenswrapper[4931]: I0130 06:50:14.014241 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:14 crc kubenswrapper[4931]: I0130 06:50:14.046587 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.060225118 podStartE2EDuration="4.046565082s" podCreationTimestamp="2026-01-30 06:50:10 +0000 UTC" firstStartedPulling="2026-01-30 06:50:11.531383802 +0000 UTC m=+6146.901294059" lastFinishedPulling="2026-01-30 06:50:12.517723756 +0000 UTC m=+6147.887634023" observedRunningTime="2026-01-30 06:50:14.024019567 +0000 UTC m=+6149.393929824" watchObservedRunningTime="2026-01-30 06:50:14.046565082 +0000 UTC m=+6149.416475339" Jan 30 06:50:14 crc kubenswrapper[4931]: I0130 06:50:14.054283 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" podStartSLOduration=4.054267579 podStartE2EDuration="4.054267579s" podCreationTimestamp="2026-01-30 06:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:50:14.043075854 +0000 UTC m=+6149.412986111" watchObservedRunningTime="2026-01-30 06:50:14.054267579 +0000 UTC m=+6149.424177826" Jan 30 06:50:15 crc kubenswrapper[4931]: I0130 06:50:15.053491 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"50c75c49-3fc8-4f3e-9af2-66535e3b49a9","Type":"ContainerStarted","Data":"23effb781db0029b2a0562728b1467627c21585f5fd6ba1d1c58f8dbbcc65656"} Jan 30 06:50:15 crc kubenswrapper[4931]: I0130 06:50:15.054217 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 30 06:50:15 crc 
kubenswrapper[4931]: I0130 06:50:15.105874 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.105858121 podStartE2EDuration="5.105858121s" podCreationTimestamp="2026-01-30 06:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:50:15.09697105 +0000 UTC m=+6150.466881317" watchObservedRunningTime="2026-01-30 06:50:15.105858121 +0000 UTC m=+6150.475768378" Jan 30 06:50:15 crc kubenswrapper[4931]: I0130 06:50:15.150173 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:15 crc kubenswrapper[4931]: I0130 06:50:15.150448 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-central-agent" containerID="cri-o://e6075bddf7d9e844cbcadfacf5366aedab5b6c0869d55610da59bca13b8f2e10" gracePeriod=30 Jan 30 06:50:15 crc kubenswrapper[4931]: I0130 06:50:15.150562 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="proxy-httpd" containerID="cri-o://c0fc86cc25abacab572b554b477bbaff87a06a333e44043d43cc5be409f8a89e" gracePeriod=30 Jan 30 06:50:15 crc kubenswrapper[4931]: I0130 06:50:15.150600 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="sg-core" containerID="cri-o://d6ffd18a5fcda569a257ea582064d96c557e419380bbc5b6e6ffba7dcc0e2a22" gracePeriod=30 Jan 30 06:50:15 crc kubenswrapper[4931]: I0130 06:50:15.150634 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-notification-agent" 
containerID="cri-o://01bff7fca0dad20c19afe6ca1af5b694418bb46bbfaf5130445564c32e038144" gracePeriod=30 Jan 30 06:50:16 crc kubenswrapper[4931]: I0130 06:50:16.065978 4931 generic.go:334] "Generic (PLEG): container finished" podID="83416e39-1feb-47a7-9e5d-748122bed281" containerID="c0fc86cc25abacab572b554b477bbaff87a06a333e44043d43cc5be409f8a89e" exitCode=0 Jan 30 06:50:16 crc kubenswrapper[4931]: I0130 06:50:16.066326 4931 generic.go:334] "Generic (PLEG): container finished" podID="83416e39-1feb-47a7-9e5d-748122bed281" containerID="d6ffd18a5fcda569a257ea582064d96c557e419380bbc5b6e6ffba7dcc0e2a22" exitCode=2 Jan 30 06:50:16 crc kubenswrapper[4931]: I0130 06:50:16.066337 4931 generic.go:334] "Generic (PLEG): container finished" podID="83416e39-1feb-47a7-9e5d-748122bed281" containerID="e6075bddf7d9e844cbcadfacf5366aedab5b6c0869d55610da59bca13b8f2e10" exitCode=0 Jan 30 06:50:16 crc kubenswrapper[4931]: I0130 06:50:16.066069 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerDied","Data":"c0fc86cc25abacab572b554b477bbaff87a06a333e44043d43cc5be409f8a89e"} Jan 30 06:50:16 crc kubenswrapper[4931]: I0130 06:50:16.066459 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerDied","Data":"d6ffd18a5fcda569a257ea582064d96c557e419380bbc5b6e6ffba7dcc0e2a22"} Jan 30 06:50:16 crc kubenswrapper[4931]: I0130 06:50:16.066476 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerDied","Data":"e6075bddf7d9e844cbcadfacf5366aedab5b6c0869d55610da59bca13b8f2e10"} Jan 30 06:50:17 crc kubenswrapper[4931]: I0130 06:50:17.271912 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:50:17 crc kubenswrapper[4931]: I0130 
06:50:17.272461 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:50:18 crc kubenswrapper[4931]: I0130 06:50:18.090948 4931 generic.go:334] "Generic (PLEG): container finished" podID="83416e39-1feb-47a7-9e5d-748122bed281" containerID="01bff7fca0dad20c19afe6ca1af5b694418bb46bbfaf5130445564c32e038144" exitCode=0 Jan 30 06:50:18 crc kubenswrapper[4931]: I0130 06:50:18.091179 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerDied","Data":"01bff7fca0dad20c19afe6ca1af5b694418bb46bbfaf5130445564c32e038144"} Jan 30 06:50:18 crc kubenswrapper[4931]: I0130 06:50:18.321137 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-j6wq2" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="registry-server" probeResult="failure" output=< Jan 30 06:50:18 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 06:50:18 crc kubenswrapper[4931]: > Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.191104 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.261469 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-run-httpd\") pod \"83416e39-1feb-47a7-9e5d-748122bed281\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.261546 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-scripts\") pod \"83416e39-1feb-47a7-9e5d-748122bed281\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.261711 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-sg-core-conf-yaml\") pod \"83416e39-1feb-47a7-9e5d-748122bed281\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.261741 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-combined-ca-bundle\") pod \"83416e39-1feb-47a7-9e5d-748122bed281\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.261821 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-log-httpd\") pod \"83416e39-1feb-47a7-9e5d-748122bed281\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.262001 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqxwx\" (UniqueName: 
\"kubernetes.io/projected/83416e39-1feb-47a7-9e5d-748122bed281-kube-api-access-pqxwx\") pod \"83416e39-1feb-47a7-9e5d-748122bed281\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.262188 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-config-data\") pod \"83416e39-1feb-47a7-9e5d-748122bed281\" (UID: \"83416e39-1feb-47a7-9e5d-748122bed281\") " Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.262352 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "83416e39-1feb-47a7-9e5d-748122bed281" (UID: "83416e39-1feb-47a7-9e5d-748122bed281"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.262382 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "83416e39-1feb-47a7-9e5d-748122bed281" (UID: "83416e39-1feb-47a7-9e5d-748122bed281"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.263448 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.263473 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83416e39-1feb-47a7-9e5d-748122bed281-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.266752 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83416e39-1feb-47a7-9e5d-748122bed281-kube-api-access-pqxwx" (OuterVolumeSpecName: "kube-api-access-pqxwx") pod "83416e39-1feb-47a7-9e5d-748122bed281" (UID: "83416e39-1feb-47a7-9e5d-748122bed281"). InnerVolumeSpecName "kube-api-access-pqxwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.267235 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-scripts" (OuterVolumeSpecName: "scripts") pod "83416e39-1feb-47a7-9e5d-748122bed281" (UID: "83416e39-1feb-47a7-9e5d-748122bed281"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.297645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "83416e39-1feb-47a7-9e5d-748122bed281" (UID: "83416e39-1feb-47a7-9e5d-748122bed281"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.360530 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83416e39-1feb-47a7-9e5d-748122bed281" (UID: "83416e39-1feb-47a7-9e5d-748122bed281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.365665 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.365691 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.365702 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.365710 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqxwx\" (UniqueName: \"kubernetes.io/projected/83416e39-1feb-47a7-9e5d-748122bed281-kube-api-access-pqxwx\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.397298 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-config-data" (OuterVolumeSpecName: "config-data") pod "83416e39-1feb-47a7-9e5d-748122bed281" (UID: "83416e39-1feb-47a7-9e5d-748122bed281"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:19 crc kubenswrapper[4931]: I0130 06:50:19.467858 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83416e39-1feb-47a7-9e5d-748122bed281-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.125024 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70","Type":"ContainerStarted","Data":"23733cadfa5f645805d1ead2ddee0cb62565699bfce6560a00cd522cf2fa23f2"} Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.125524 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70","Type":"ContainerStarted","Data":"184f4f07cc92de22f89c14c90a2d5c84bf1ec61f1a94772da5110b4c53f3590c"} Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.141643 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83416e39-1feb-47a7-9e5d-748122bed281","Type":"ContainerDied","Data":"c52383b65f03eb04f4d7782d65ea62f04f107a518cccdbc33d3de6808caeee13"} Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.141697 4931 scope.go:117] "RemoveContainer" containerID="c0fc86cc25abacab572b554b477bbaff87a06a333e44043d43cc5be409f8a89e" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.141820 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.161859 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.019763289 podStartE2EDuration="10.161840762s" podCreationTimestamp="2026-01-30 06:50:10 +0000 UTC" firstStartedPulling="2026-01-30 06:50:11.786831483 +0000 UTC m=+6147.156741750" lastFinishedPulling="2026-01-30 06:50:18.928908966 +0000 UTC m=+6154.298819223" observedRunningTime="2026-01-30 06:50:20.15929098 +0000 UTC m=+6155.529201247" watchObservedRunningTime="2026-01-30 06:50:20.161840762 +0000 UTC m=+6155.531751029" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.174478 4931 scope.go:117] "RemoveContainer" containerID="d6ffd18a5fcda569a257ea582064d96c557e419380bbc5b6e6ffba7dcc0e2a22" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.193013 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.213568 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.216391 4931 scope.go:117] "RemoveContainer" containerID="01bff7fca0dad20c19afe6ca1af5b694418bb46bbfaf5130445564c32e038144" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.222223 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:20 crc kubenswrapper[4931]: E0130 06:50:20.222749 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-notification-agent" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.222772 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-notification-agent" Jan 30 06:50:20 crc kubenswrapper[4931]: E0130 06:50:20.222803 4931 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-central-agent" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.222812 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-central-agent" Jan 30 06:50:20 crc kubenswrapper[4931]: E0130 06:50:20.222832 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="proxy-httpd" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.222839 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="proxy-httpd" Jan 30 06:50:20 crc kubenswrapper[4931]: E0130 06:50:20.222861 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="sg-core" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.222887 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="sg-core" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.223116 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="sg-core" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.223142 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-notification-agent" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.223158 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="ceilometer-central-agent" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.223179 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="83416e39-1feb-47a7-9e5d-748122bed281" containerName="proxy-httpd" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.225956 4931 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.230868 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.231128 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.232026 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.263969 4931 scope.go:117] "RemoveContainer" containerID="e6075bddf7d9e844cbcadfacf5366aedab5b6c0869d55610da59bca13b8f2e10" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.283593 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-config-data\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.283661 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-scripts\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.283822 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.283905 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-qxjgq\" (UniqueName: \"kubernetes.io/projected/57fae8dd-3d28-4bc8-b8f2-667d583a4931-kube-api-access-qxjgq\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.284172 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.284464 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-log-httpd\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.284532 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-run-httpd\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.386377 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-scripts\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.386454 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.386483 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxjgq\" (UniqueName: \"kubernetes.io/projected/57fae8dd-3d28-4bc8-b8f2-667d583a4931-kube-api-access-qxjgq\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.386549 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.386612 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-log-httpd\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.386632 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-run-httpd\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.386684 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-config-data\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.387461 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-log-httpd\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.387686 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-run-httpd\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.395162 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.395509 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-config-data\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.398063 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-scripts\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.411291 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.422284 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxjgq\" (UniqueName: \"kubernetes.io/projected/57fae8dd-3d28-4bc8-b8f2-667d583a4931-kube-api-access-qxjgq\") pod \"ceilometer-0\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.556728 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.907212 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.925082 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.938631 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59977785bf-q4vw9" Jan 30 06:50:20 crc kubenswrapper[4931]: I0130 06:50:20.946354 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 30 06:50:21 crc kubenswrapper[4931]: I0130 06:50:21.023270 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879d4f7c5-x7dw2"] Jan 30 06:50:21 crc kubenswrapper[4931]: I0130 06:50:21.023703 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" containerName="dnsmasq-dns" containerID="cri-o://1b21e024974dbffc1b686bd5f52316fe76f211ede1a6fa05295886b31dbd35b5" gracePeriod=10 Jan 30 06:50:21 crc kubenswrapper[4931]: I0130 06:50:21.120656 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.82:5353: connect: connection refused" Jan 30 
06:50:21 crc kubenswrapper[4931]: I0130 06:50:21.159743 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerStarted","Data":"29cb71fe5f366a9385aa94ca5e212e47a7ecc8bdc0bac6a88eee5b6e47ca58de"} Jan 30 06:50:21 crc kubenswrapper[4931]: I0130 06:50:21.461967 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83416e39-1feb-47a7-9e5d-748122bed281" path="/var/lib/kubelet/pods/83416e39-1feb-47a7-9e5d-748122bed281/volumes" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.173307 4931 generic.go:334] "Generic (PLEG): container finished" podID="214b78b9-e769-4474-be87-e9b494c2fa69" containerID="1b21e024974dbffc1b686bd5f52316fe76f211ede1a6fa05295886b31dbd35b5" exitCode=0 Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.173354 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" event={"ID":"214b78b9-e769-4474-be87-e9b494c2fa69","Type":"ContainerDied","Data":"1b21e024974dbffc1b686bd5f52316fe76f211ede1a6fa05295886b31dbd35b5"} Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.806702 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.838978 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn9tk\" (UniqueName: \"kubernetes.io/projected/214b78b9-e769-4474-be87-e9b494c2fa69-kube-api-access-pn9tk\") pod \"214b78b9-e769-4474-be87-e9b494c2fa69\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.839047 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-dns-svc\") pod \"214b78b9-e769-4474-be87-e9b494c2fa69\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.839077 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-config\") pod \"214b78b9-e769-4474-be87-e9b494c2fa69\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.839172 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-nb\") pod \"214b78b9-e769-4474-be87-e9b494c2fa69\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.839240 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-sb\") pod \"214b78b9-e769-4474-be87-e9b494c2fa69\" (UID: \"214b78b9-e769-4474-be87-e9b494c2fa69\") " Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.844100 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/214b78b9-e769-4474-be87-e9b494c2fa69-kube-api-access-pn9tk" (OuterVolumeSpecName: "kube-api-access-pn9tk") pod "214b78b9-e769-4474-be87-e9b494c2fa69" (UID: "214b78b9-e769-4474-be87-e9b494c2fa69"). InnerVolumeSpecName "kube-api-access-pn9tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.893534 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "214b78b9-e769-4474-be87-e9b494c2fa69" (UID: "214b78b9-e769-4474-be87-e9b494c2fa69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.926325 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "214b78b9-e769-4474-be87-e9b494c2fa69" (UID: "214b78b9-e769-4474-be87-e9b494c2fa69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.930026 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-config" (OuterVolumeSpecName: "config") pod "214b78b9-e769-4474-be87-e9b494c2fa69" (UID: "214b78b9-e769-4474-be87-e9b494c2fa69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.937014 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "214b78b9-e769-4474-be87-e9b494c2fa69" (UID: "214b78b9-e769-4474-be87-e9b494c2fa69"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.941855 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.941889 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.941989 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn9tk\" (UniqueName: \"kubernetes.io/projected/214b78b9-e769-4474-be87-e9b494c2fa69-kube-api-access-pn9tk\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.942024 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:22 crc kubenswrapper[4931]: I0130 06:50:22.942033 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214b78b9-e769-4474-be87-e9b494c2fa69-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.186782 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" event={"ID":"214b78b9-e769-4474-be87-e9b494c2fa69","Type":"ContainerDied","Data":"1cea5345d991653f1a6830de732c35c4b2f81ed6821f46e956ec8f3a43e28720"} Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.186888 4931 scope.go:117] "RemoveContainer" containerID="1b21e024974dbffc1b686bd5f52316fe76f211ede1a6fa05295886b31dbd35b5" Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.187009 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5879d4f7c5-x7dw2" Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.192704 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerStarted","Data":"197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914"} Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.220963 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5879d4f7c5-x7dw2"] Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.226636 4931 scope.go:117] "RemoveContainer" containerID="45cf3829eaba7efc9ffdbde5fa46c91facdbe555edf8963708f266596e0113d9" Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.228244 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5879d4f7c5-x7dw2"] Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.332996 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:23 crc kubenswrapper[4931]: I0130 06:50:23.436099 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" path="/var/lib/kubelet/pods/214b78b9-e769-4474-be87-e9b494c2fa69/volumes" Jan 30 06:50:24 crc kubenswrapper[4931]: I0130 06:50:24.207146 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerStarted","Data":"bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631"} Jan 30 06:50:24 crc kubenswrapper[4931]: I0130 06:50:24.207536 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerStarted","Data":"54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e"} Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.245180 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerStarted","Data":"9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25"} Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.245616 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="ceilometer-central-agent" containerID="cri-o://197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914" gracePeriod=30 Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.245666 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.245719 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="proxy-httpd" containerID="cri-o://9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25" gracePeriod=30 Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.245765 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="sg-core" containerID="cri-o://bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631" gracePeriod=30 Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.245806 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="ceilometer-notification-agent" containerID="cri-o://54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e" gracePeriod=30 Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.362805 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.362870 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.362921 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.363713 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb726f99bee299533a0b10daf42c4d3c80f89e1b2459842bb36e1df7a3f9faa7"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:50:27 crc kubenswrapper[4931]: I0130 06:50:27.363769 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://fb726f99bee299533a0b10daf42c4d3c80f89e1b2459842bb36e1df7a3f9faa7" gracePeriod=600 Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.264099 4931 generic.go:334] "Generic (PLEG): container finished" podID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerID="9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25" exitCode=0 Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.264542 4931 generic.go:334] "Generic (PLEG): container finished" podID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" 
containerID="bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631" exitCode=2 Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.264551 4931 generic.go:334] "Generic (PLEG): container finished" podID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerID="54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e" exitCode=0 Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.264165 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerDied","Data":"9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25"} Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.264620 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerDied","Data":"bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631"} Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.264635 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerDied","Data":"54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e"} Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.268789 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="fb726f99bee299533a0b10daf42c4d3c80f89e1b2459842bb36e1df7a3f9faa7" exitCode=0 Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.268818 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"fb726f99bee299533a0b10daf42c4d3c80f89e1b2459842bb36e1df7a3f9faa7"} Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.268883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d"} Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.268914 4931 scope.go:117] "RemoveContainer" containerID="981373dfbb9d9692d805ba5950bbc2a4aba00fefca1f782507816d2c2ba0f7f7" Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.310955 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.299934937 podStartE2EDuration="8.310929413s" podCreationTimestamp="2026-01-30 06:50:20 +0000 UTC" firstStartedPulling="2026-01-30 06:50:20.921874697 +0000 UTC m=+6156.291784954" lastFinishedPulling="2026-01-30 06:50:25.932869143 +0000 UTC m=+6161.302779430" observedRunningTime="2026-01-30 06:50:27.286950318 +0000 UTC m=+6162.656860585" watchObservedRunningTime="2026-01-30 06:50:28.310929413 +0000 UTC m=+6163.680839690" Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.363248 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-j6wq2" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="registry-server" probeResult="failure" output=< Jan 30 06:50:28 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 06:50:28 crc kubenswrapper[4931]: > Jan 30 06:50:28 crc kubenswrapper[4931]: I0130 06:50:28.927310 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.001766 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxjgq\" (UniqueName: \"kubernetes.io/projected/57fae8dd-3d28-4bc8-b8f2-667d583a4931-kube-api-access-qxjgq\") pod \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.001833 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-run-httpd\") pod \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.001899 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-combined-ca-bundle\") pod \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.001977 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-scripts\") pod \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.002017 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-sg-core-conf-yaml\") pod \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.002072 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-config-data\") pod \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.002217 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-log-httpd\") pod \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\" (UID: \"57fae8dd-3d28-4bc8-b8f2-667d583a4931\") " Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.002505 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "57fae8dd-3d28-4bc8-b8f2-667d583a4931" (UID: "57fae8dd-3d28-4bc8-b8f2-667d583a4931"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.002801 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.003744 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "57fae8dd-3d28-4bc8-b8f2-667d583a4931" (UID: "57fae8dd-3d28-4bc8-b8f2-667d583a4931"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.011981 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-scripts" (OuterVolumeSpecName: "scripts") pod "57fae8dd-3d28-4bc8-b8f2-667d583a4931" (UID: "57fae8dd-3d28-4bc8-b8f2-667d583a4931"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.012064 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fae8dd-3d28-4bc8-b8f2-667d583a4931-kube-api-access-qxjgq" (OuterVolumeSpecName: "kube-api-access-qxjgq") pod "57fae8dd-3d28-4bc8-b8f2-667d583a4931" (UID: "57fae8dd-3d28-4bc8-b8f2-667d583a4931"). InnerVolumeSpecName "kube-api-access-qxjgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.036822 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "57fae8dd-3d28-4bc8-b8f2-667d583a4931" (UID: "57fae8dd-3d28-4bc8-b8f2-667d583a4931"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.104307 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57fae8dd-3d28-4bc8-b8f2-667d583a4931-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.104343 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxjgq\" (UniqueName: \"kubernetes.io/projected/57fae8dd-3d28-4bc8-b8f2-667d583a4931-kube-api-access-qxjgq\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.104354 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.104362 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.105132 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57fae8dd-3d28-4bc8-b8f2-667d583a4931" (UID: "57fae8dd-3d28-4bc8-b8f2-667d583a4931"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.125602 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-config-data" (OuterVolumeSpecName: "config-data") pod "57fae8dd-3d28-4bc8-b8f2-667d583a4931" (UID: "57fae8dd-3d28-4bc8-b8f2-667d583a4931"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.206476 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.206704 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fae8dd-3d28-4bc8-b8f2-667d583a4931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.285302 4931 generic.go:334] "Generic (PLEG): container finished" podID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerID="197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914" exitCode=0 Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.285374 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerDied","Data":"197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914"} Jan 30 06:50:29 
crc kubenswrapper[4931]: I0130 06:50:29.285401 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57fae8dd-3d28-4bc8-b8f2-667d583a4931","Type":"ContainerDied","Data":"29cb71fe5f366a9385aa94ca5e212e47a7ecc8bdc0bac6a88eee5b6e47ca58de"} Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.285417 4931 scope.go:117] "RemoveContainer" containerID="9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.286222 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.313494 4931 scope.go:117] "RemoveContainer" containerID="bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.354715 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.381140 4931 scope.go:117] "RemoveContainer" containerID="54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.384582 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.395720 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.398720 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="sg-core" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.398753 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="sg-core" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.398777 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" 
containerName="ceilometer-central-agent" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.398784 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="ceilometer-central-agent" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.398791 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="proxy-httpd" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.398796 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="proxy-httpd" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.398809 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" containerName="dnsmasq-dns" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.398815 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" containerName="dnsmasq-dns" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.398827 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="ceilometer-notification-agent" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.398832 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="ceilometer-notification-agent" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.398849 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" containerName="init" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.398855 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" containerName="init" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.399169 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" 
containerName="ceilometer-central-agent" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.399181 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="214b78b9-e769-4474-be87-e9b494c2fa69" containerName="dnsmasq-dns" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.399196 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="ceilometer-notification-agent" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.399206 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="proxy-httpd" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.399226 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" containerName="sg-core" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.401359 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.404038 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.404441 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.404818 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.415796 4931 scope.go:117] "RemoveContainer" containerID="197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.441165 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57fae8dd-3d28-4bc8-b8f2-667d583a4931" path="/var/lib/kubelet/pods/57fae8dd-3d28-4bc8-b8f2-667d583a4931/volumes" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.443239 4931 
scope.go:117] "RemoveContainer" containerID="9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.443630 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25\": container with ID starting with 9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25 not found: ID does not exist" containerID="9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.443668 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25"} err="failed to get container status \"9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25\": rpc error: code = NotFound desc = could not find container \"9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25\": container with ID starting with 9b6b1a571845a37b2c3b7a0e309b0fde565dc4026786cde88a1e3aa91519fd25 not found: ID does not exist" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.443687 4931 scope.go:117] "RemoveContainer" containerID="bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.444053 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631\": container with ID starting with bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631 not found: ID does not exist" containerID="bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.444083 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631"} err="failed to get container status \"bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631\": rpc error: code = NotFound desc = could not find container \"bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631\": container with ID starting with bf82ab3602e67902b4e094a710a1cd379b3bda403032cb0e5c2107422f572631 not found: ID does not exist" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.444099 4931 scope.go:117] "RemoveContainer" containerID="54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.444321 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e\": container with ID starting with 54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e not found: ID does not exist" containerID="54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.444344 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e"} err="failed to get container status \"54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e\": rpc error: code = NotFound desc = could not find container \"54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e\": container with ID starting with 54dbc567a6f51424cbe90715a2274c2a39e9560d24fbfc82467bed1bf7038a3e not found: ID does not exist" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.444362 4931 scope.go:117] "RemoveContainer" containerID="197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914" Jan 30 06:50:29 crc kubenswrapper[4931]: E0130 06:50:29.444650 4931 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914\": container with ID starting with 197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914 not found: ID does not exist" containerID="197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.444671 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914"} err="failed to get container status \"197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914\": rpc error: code = NotFound desc = could not find container \"197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914\": container with ID starting with 197b2e00bf1b62071bfd0ea1789b2cd2e3aa6238e686992fda263b2330b2a914 not found: ID does not exist" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.513658 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-run-httpd\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.513793 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbbt\" (UniqueName: \"kubernetes.io/projected/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-kube-api-access-5tbbt\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.513841 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-config-data\") pod \"ceilometer-0\" (UID: 
\"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.513885 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.513923 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.513949 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-scripts\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.514039 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-log-httpd\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.616026 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbbt\" (UniqueName: \"kubernetes.io/projected/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-kube-api-access-5tbbt\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.616361 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-config-data\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.616448 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.616498 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.616538 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-scripts\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.616632 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-log-httpd\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.616722 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.617703 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-run-httpd\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.618575 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-log-httpd\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.620746 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.620822 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-scripts\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.621518 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-config-data\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.621594 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.636850 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbbt\" (UniqueName: \"kubernetes.io/projected/d49c4b54-cfb7-4264-a6b8-9ee32cc53de7-kube-api-access-5tbbt\") pod \"ceilometer-0\" (UID: \"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7\") " pod="openstack/ceilometer-0" Jan 30 06:50:29 crc kubenswrapper[4931]: I0130 06:50:29.723740 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:50:30 crc kubenswrapper[4931]: I0130 06:50:30.216982 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:50:30 crc kubenswrapper[4931]: W0130 06:50:30.223598 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd49c4b54_cfb7_4264_a6b8_9ee32cc53de7.slice/crio-e70ec4a1e75b08ae74bef2c03c30f8fde6723ad39111490f1e1adc5dbac355ad WatchSource:0}: Error finding container e70ec4a1e75b08ae74bef2c03c30f8fde6723ad39111490f1e1adc5dbac355ad: Status 404 returned error can't find the container with id e70ec4a1e75b08ae74bef2c03c30f8fde6723ad39111490f1e1adc5dbac355ad Jan 30 06:50:30 crc kubenswrapper[4931]: I0130 06:50:30.302064 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7","Type":"ContainerStarted","Data":"e70ec4a1e75b08ae74bef2c03c30f8fde6723ad39111490f1e1adc5dbac355ad"} Jan 30 06:50:31 crc kubenswrapper[4931]: I0130 06:50:31.313051 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7","Type":"ContainerStarted","Data":"b349f3523aa5d13b1408a90273ff6dec9f52ed69ae81c38a2c9c2e27d19c77ab"} 
Jan 30 06:50:32 crc kubenswrapper[4931]: I0130 06:50:32.323532 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7","Type":"ContainerStarted","Data":"a341b96dfe0b6e82654dabec321e7d7a9a8387bbf76d026a7f791d913d883308"} Jan 30 06:50:32 crc kubenswrapper[4931]: I0130 06:50:32.487879 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 30 06:50:32 crc kubenswrapper[4931]: I0130 06:50:32.560090 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 30 06:50:32 crc kubenswrapper[4931]: I0130 06:50:32.565507 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Jan 30 06:50:33 crc kubenswrapper[4931]: I0130 06:50:33.335352 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7","Type":"ContainerStarted","Data":"6b53435422486799234d64567e00201256b9fd0017f0d413b2e249c8fa6e81a6"} Jan 30 06:50:35 crc kubenswrapper[4931]: I0130 06:50:35.364071 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d49c4b54-cfb7-4264-a6b8-9ee32cc53de7","Type":"ContainerStarted","Data":"beea85f311109358d788e6b45d0d72e60fc1353e7e6eb9561f2ba3b164cf7187"} Jan 30 06:50:35 crc kubenswrapper[4931]: I0130 06:50:35.364995 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 06:50:35 crc kubenswrapper[4931]: I0130 06:50:35.390693 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.111099626 podStartE2EDuration="6.390658912s" podCreationTimestamp="2026-01-30 06:50:29 +0000 UTC" firstStartedPulling="2026-01-30 06:50:30.227129842 +0000 UTC m=+6165.597040099" lastFinishedPulling="2026-01-30 06:50:34.506689118 +0000 UTC 
m=+6169.876599385" observedRunningTime="2026-01-30 06:50:35.38739314 +0000 UTC m=+6170.757303497" watchObservedRunningTime="2026-01-30 06:50:35.390658912 +0000 UTC m=+6170.760569219" Jan 30 06:50:37 crc kubenswrapper[4931]: I0130 06:50:37.324085 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:50:37 crc kubenswrapper[4931]: I0130 06:50:37.374716 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:50:37 crc kubenswrapper[4931]: I0130 06:50:37.579414 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6wq2"] Jan 30 06:50:38 crc kubenswrapper[4931]: I0130 06:50:38.396795 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6wq2" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="registry-server" containerID="cri-o://137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1" gracePeriod=2 Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.184691 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.243715 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-catalog-content\") pod \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.244113 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-utilities\") pod \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.244503 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngkp9\" (UniqueName: \"kubernetes.io/projected/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-kube-api-access-ngkp9\") pod \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\" (UID: \"2b12adab-30ea-4122-9ea7-0c8b2fdb117c\") " Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.246592 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-utilities" (OuterVolumeSpecName: "utilities") pod "2b12adab-30ea-4122-9ea7-0c8b2fdb117c" (UID: "2b12adab-30ea-4122-9ea7-0c8b2fdb117c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.262999 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-kube-api-access-ngkp9" (OuterVolumeSpecName: "kube-api-access-ngkp9") pod "2b12adab-30ea-4122-9ea7-0c8b2fdb117c" (UID: "2b12adab-30ea-4122-9ea7-0c8b2fdb117c"). InnerVolumeSpecName "kube-api-access-ngkp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.316941 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b12adab-30ea-4122-9ea7-0c8b2fdb117c" (UID: "2b12adab-30ea-4122-9ea7-0c8b2fdb117c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.347599 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngkp9\" (UniqueName: \"kubernetes.io/projected/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-kube-api-access-ngkp9\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.347643 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.347657 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b12adab-30ea-4122-9ea7-0c8b2fdb117c-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.415585 4931 generic.go:334] "Generic (PLEG): container finished" podID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerID="137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1" exitCode=0 Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.415634 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6wq2" event={"ID":"2b12adab-30ea-4122-9ea7-0c8b2fdb117c","Type":"ContainerDied","Data":"137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1"} Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.415666 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-j6wq2" event={"ID":"2b12adab-30ea-4122-9ea7-0c8b2fdb117c","Type":"ContainerDied","Data":"a18b31c9c3fa680e8d93ca8737e2041057cbd5a166876584d210ca053654abaf"} Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.415685 4931 scope.go:117] "RemoveContainer" containerID="137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.416876 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6wq2" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.448106 4931 scope.go:117] "RemoveContainer" containerID="681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.480800 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6wq2"] Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.491027 4931 scope.go:117] "RemoveContainer" containerID="b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.493608 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6wq2"] Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.539617 4931 scope.go:117] "RemoveContainer" containerID="137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1" Jan 30 06:50:39 crc kubenswrapper[4931]: E0130 06:50:39.540166 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1\": container with ID starting with 137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1 not found: ID does not exist" containerID="137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 
06:50:39.540215 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1"} err="failed to get container status \"137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1\": rpc error: code = NotFound desc = could not find container \"137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1\": container with ID starting with 137405c9a2e57ffc07560da2bab8534c0b526dfafb00e20c14c914b0824bcca1 not found: ID does not exist" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.540239 4931 scope.go:117] "RemoveContainer" containerID="681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4" Jan 30 06:50:39 crc kubenswrapper[4931]: E0130 06:50:39.540648 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4\": container with ID starting with 681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4 not found: ID does not exist" containerID="681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.540675 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4"} err="failed to get container status \"681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4\": rpc error: code = NotFound desc = could not find container \"681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4\": container with ID starting with 681b6c299cbe135891d8c8dcaae11331648366ef679f31fa6b02fe2f6fc1b5d4 not found: ID does not exist" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.540690 4931 scope.go:117] "RemoveContainer" containerID="b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d" Jan 30 06:50:39 crc 
kubenswrapper[4931]: E0130 06:50:39.540975 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d\": container with ID starting with b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d not found: ID does not exist" containerID="b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d" Jan 30 06:50:39 crc kubenswrapper[4931]: I0130 06:50:39.540998 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d"} err="failed to get container status \"b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d\": rpc error: code = NotFound desc = could not find container \"b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d\": container with ID starting with b96dd5bc39ca8e281e96819198e81f51b95d1548ac1ca72b9934c7b258b3531d not found: ID does not exist" Jan 30 06:50:41 crc kubenswrapper[4931]: I0130 06:50:41.435069 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" path="/var/lib/kubelet/pods/2b12adab-30ea-4122-9ea7-0c8b2fdb117c/volumes" Jan 30 06:50:59 crc kubenswrapper[4931]: I0130 06:50:59.738309 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.191581 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-57fqf/must-gather-v9ths"] Jan 30 06:52:05 crc kubenswrapper[4931]: E0130 06:52:05.193578 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="extract-content" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.193655 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" 
containerName="extract-content" Jan 30 06:52:05 crc kubenswrapper[4931]: E0130 06:52:05.193728 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="registry-server" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.193784 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="registry-server" Jan 30 06:52:05 crc kubenswrapper[4931]: E0130 06:52:05.194047 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="extract-utilities" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.194101 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="extract-utilities" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.194375 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b12adab-30ea-4122-9ea7-0c8b2fdb117c" containerName="registry-server" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.195568 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.197062 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-57fqf"/"openshift-service-ca.crt" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.197637 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-57fqf"/"default-dockercfg-zrkvq" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.199890 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-57fqf"/"kube-root-ca.crt" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.210981 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-57fqf/must-gather-v9ths"] Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.287025 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1075a448-992c-4364-842b-06dda255cd42-must-gather-output\") pod \"must-gather-v9ths\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.287362 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcnzx\" (UniqueName: \"kubernetes.io/projected/1075a448-992c-4364-842b-06dda255cd42-kube-api-access-wcnzx\") pod \"must-gather-v9ths\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.389916 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcnzx\" (UniqueName: \"kubernetes.io/projected/1075a448-992c-4364-842b-06dda255cd42-kube-api-access-wcnzx\") pod \"must-gather-v9ths\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " 
pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.390086 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1075a448-992c-4364-842b-06dda255cd42-must-gather-output\") pod \"must-gather-v9ths\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.390553 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1075a448-992c-4364-842b-06dda255cd42-must-gather-output\") pod \"must-gather-v9ths\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.413750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcnzx\" (UniqueName: \"kubernetes.io/projected/1075a448-992c-4364-842b-06dda255cd42-kube-api-access-wcnzx\") pod \"must-gather-v9ths\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:05 crc kubenswrapper[4931]: I0130 06:52:05.511627 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:52:06 crc kubenswrapper[4931]: I0130 06:52:06.072384 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-57fqf/must-gather-v9ths"] Jan 30 06:52:06 crc kubenswrapper[4931]: I0130 06:52:06.549252 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/must-gather-v9ths" event={"ID":"1075a448-992c-4364-842b-06dda255cd42","Type":"ContainerStarted","Data":"cf436afdc02c26e863dc8ad812f5a2c820057a0e4bd2d7c8832d02ba1d788c94"} Jan 30 06:52:14 crc kubenswrapper[4931]: I0130 06:52:14.648837 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/must-gather-v9ths" event={"ID":"1075a448-992c-4364-842b-06dda255cd42","Type":"ContainerStarted","Data":"f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92"} Jan 30 06:52:15 crc kubenswrapper[4931]: I0130 06:52:15.662947 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/must-gather-v9ths" event={"ID":"1075a448-992c-4364-842b-06dda255cd42","Type":"ContainerStarted","Data":"b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942"} Jan 30 06:52:15 crc kubenswrapper[4931]: I0130 06:52:15.688985 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-57fqf/must-gather-v9ths" podStartSLOduration=2.670147832 podStartE2EDuration="10.688950865s" podCreationTimestamp="2026-01-30 06:52:05 +0000 UTC" firstStartedPulling="2026-01-30 06:52:06.072704525 +0000 UTC m=+6261.442614782" lastFinishedPulling="2026-01-30 06:52:14.091507558 +0000 UTC m=+6269.461417815" observedRunningTime="2026-01-30 06:52:15.683600354 +0000 UTC m=+6271.053510671" watchObservedRunningTime="2026-01-30 06:52:15.688950865 +0000 UTC m=+6271.058861162" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.566474 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-57fqf/crc-debug-k2m6k"] Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.568281 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.689487 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6z7\" (UniqueName: \"kubernetes.io/projected/57350cb0-488c-4df8-808a-a9327d16816d-kube-api-access-dv6z7\") pod \"crc-debug-k2m6k\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.690031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57350cb0-488c-4df8-808a-a9327d16816d-host\") pod \"crc-debug-k2m6k\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.791404 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57350cb0-488c-4df8-808a-a9327d16816d-host\") pod \"crc-debug-k2m6k\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.791476 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57350cb0-488c-4df8-808a-a9327d16816d-host\") pod \"crc-debug-k2m6k\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.791591 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6z7\" (UniqueName: 
\"kubernetes.io/projected/57350cb0-488c-4df8-808a-a9327d16816d-kube-api-access-dv6z7\") pod \"crc-debug-k2m6k\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.813594 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6z7\" (UniqueName: \"kubernetes.io/projected/57350cb0-488c-4df8-808a-a9327d16816d-kube-api-access-dv6z7\") pod \"crc-debug-k2m6k\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: I0130 06:52:18.884058 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:18 crc kubenswrapper[4931]: W0130 06:52:18.928929 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57350cb0_488c_4df8_808a_a9327d16816d.slice/crio-99e60a1f5ea9c470502256a828f0ebd37f1595ad3a4fa47fcd3728c61b6ab807 WatchSource:0}: Error finding container 99e60a1f5ea9c470502256a828f0ebd37f1595ad3a4fa47fcd3728c61b6ab807: Status 404 returned error can't find the container with id 99e60a1f5ea9c470502256a828f0ebd37f1595ad3a4fa47fcd3728c61b6ab807 Jan 30 06:52:19 crc kubenswrapper[4931]: I0130 06:52:19.761321 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" event={"ID":"57350cb0-488c-4df8-808a-a9327d16816d","Type":"ContainerStarted","Data":"99e60a1f5ea9c470502256a828f0ebd37f1595ad3a4fa47fcd3728c61b6ab807"} Jan 30 06:52:27 crc kubenswrapper[4931]: I0130 06:52:27.363035 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 30 06:52:27 crc kubenswrapper[4931]: I0130 06:52:27.363685 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:52:32 crc kubenswrapper[4931]: I0130 06:52:32.895449 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" event={"ID":"57350cb0-488c-4df8-808a-a9327d16816d","Type":"ContainerStarted","Data":"a1eade7d298ab8964a28bb2ee51b88b39f3ffa30229e7d9814ffc7f1e58b96ec"} Jan 30 06:52:32 crc kubenswrapper[4931]: I0130 06:52:32.909180 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" podStartSLOduration=1.8654467609999998 podStartE2EDuration="14.909159561s" podCreationTimestamp="2026-01-30 06:52:18 +0000 UTC" firstStartedPulling="2026-01-30 06:52:18.930858942 +0000 UTC m=+6274.300769209" lastFinishedPulling="2026-01-30 06:52:31.974571732 +0000 UTC m=+6287.344482009" observedRunningTime="2026-01-30 06:52:32.906757773 +0000 UTC m=+6288.276668030" watchObservedRunningTime="2026-01-30 06:52:32.909159561 +0000 UTC m=+6288.279069838" Jan 30 06:52:39 crc kubenswrapper[4931]: I0130 06:52:39.052619 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-5v2g5"] Jan 30 06:52:39 crc kubenswrapper[4931]: I0130 06:52:39.063127 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-5v2g5"] Jan 30 06:52:39 crc kubenswrapper[4931]: I0130 06:52:39.437006 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031" path="/var/lib/kubelet/pods/cfb96a2d-1dc4-45d0-9dd2-b2a0bda9a031/volumes" Jan 30 06:52:40 crc kubenswrapper[4931]: I0130 06:52:40.270832 
4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db99-account-create-update-8dpb5"] Jan 30 06:52:40 crc kubenswrapper[4931]: I0130 06:52:40.284737 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db99-account-create-update-8dpb5"] Jan 30 06:52:41 crc kubenswrapper[4931]: I0130 06:52:41.630764 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21268b76-c5b2-457f-a433-ff2da3b9bd10" path="/var/lib/kubelet/pods/21268b76-c5b2-457f-a433-ff2da3b9bd10/volumes" Jan 30 06:52:46 crc kubenswrapper[4931]: I0130 06:52:46.046933 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-scftb"] Jan 30 06:52:46 crc kubenswrapper[4931]: I0130 06:52:46.066269 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-scftb"] Jan 30 06:52:47 crc kubenswrapper[4931]: I0130 06:52:47.036126 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-06bb-account-create-update-9w2dc"] Jan 30 06:52:47 crc kubenswrapper[4931]: I0130 06:52:47.051940 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-06bb-account-create-update-9w2dc"] Jan 30 06:52:47 crc kubenswrapper[4931]: I0130 06:52:47.513954 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c32b952-ea20-4e38-be3d-0ca833fb8aaf" path="/var/lib/kubelet/pods/7c32b952-ea20-4e38-be3d-0ca833fb8aaf/volumes" Jan 30 06:52:47 crc kubenswrapper[4931]: I0130 06:52:47.515769 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffbc623-924e-4952-890f-da78398d60fb" path="/var/lib/kubelet/pods/cffbc623-924e-4952-890f-da78398d60fb/volumes" Jan 30 06:52:57 crc kubenswrapper[4931]: I0130 06:52:57.156348 4931 generic.go:334] "Generic (PLEG): container finished" podID="57350cb0-488c-4df8-808a-a9327d16816d" containerID="a1eade7d298ab8964a28bb2ee51b88b39f3ffa30229e7d9814ffc7f1e58b96ec" exitCode=0 Jan 30 
06:52:57 crc kubenswrapper[4931]: I0130 06:52:57.156759 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" event={"ID":"57350cb0-488c-4df8-808a-a9327d16816d","Type":"ContainerDied","Data":"a1eade7d298ab8964a28bb2ee51b88b39f3ffa30229e7d9814ffc7f1e58b96ec"} Jan 30 06:52:57 crc kubenswrapper[4931]: I0130 06:52:57.362537 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:52:57 crc kubenswrapper[4931]: I0130 06:52:57.362595 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.321713 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.365654 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-57fqf/crc-debug-k2m6k"] Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.393204 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-57fqf/crc-debug-k2m6k"] Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.428938 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57350cb0-488c-4df8-808a-a9327d16816d-host\") pod \"57350cb0-488c-4df8-808a-a9327d16816d\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.429104 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57350cb0-488c-4df8-808a-a9327d16816d-host" (OuterVolumeSpecName: "host") pod "57350cb0-488c-4df8-808a-a9327d16816d" (UID: "57350cb0-488c-4df8-808a-a9327d16816d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.429120 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv6z7\" (UniqueName: \"kubernetes.io/projected/57350cb0-488c-4df8-808a-a9327d16816d-kube-api-access-dv6z7\") pod \"57350cb0-488c-4df8-808a-a9327d16816d\" (UID: \"57350cb0-488c-4df8-808a-a9327d16816d\") " Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.432698 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57350cb0-488c-4df8-808a-a9327d16816d-host\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.447688 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57350cb0-488c-4df8-808a-a9327d16816d-kube-api-access-dv6z7" (OuterVolumeSpecName: "kube-api-access-dv6z7") pod "57350cb0-488c-4df8-808a-a9327d16816d" (UID: "57350cb0-488c-4df8-808a-a9327d16816d"). InnerVolumeSpecName "kube-api-access-dv6z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:52:58 crc kubenswrapper[4931]: I0130 06:52:58.534932 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv6z7\" (UniqueName: \"kubernetes.io/projected/57350cb0-488c-4df8-808a-a9327d16816d-kube-api-access-dv6z7\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.179866 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e60a1f5ea9c470502256a828f0ebd37f1595ad3a4fa47fcd3728c61b6ab807" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.179918 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-k2m6k" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.436099 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57350cb0-488c-4df8-808a-a9327d16816d" path="/var/lib/kubelet/pods/57350cb0-488c-4df8-808a-a9327d16816d/volumes" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.620170 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-57fqf/crc-debug-fgbqg"] Jan 30 06:52:59 crc kubenswrapper[4931]: E0130 06:52:59.621980 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57350cb0-488c-4df8-808a-a9327d16816d" containerName="container-00" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.622049 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="57350cb0-488c-4df8-808a-a9327d16816d" containerName="container-00" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.622303 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="57350cb0-488c-4df8-808a-a9327d16816d" containerName="container-00" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.623094 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.655262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-host\") pod \"crc-debug-fgbqg\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.655313 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nzgg\" (UniqueName: \"kubernetes.io/projected/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-kube-api-access-5nzgg\") pod \"crc-debug-fgbqg\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.757493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-host\") pod \"crc-debug-fgbqg\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.757538 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nzgg\" (UniqueName: \"kubernetes.io/projected/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-kube-api-access-5nzgg\") pod \"crc-debug-fgbqg\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.757679 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-host\") pod \"crc-debug-fgbqg\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:52:59 crc 
kubenswrapper[4931]: I0130 06:52:59.774107 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nzgg\" (UniqueName: \"kubernetes.io/projected/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-kube-api-access-5nzgg\") pod \"crc-debug-fgbqg\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:52:59 crc kubenswrapper[4931]: I0130 06:52:59.940957 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:53:00 crc kubenswrapper[4931]: I0130 06:53:00.188878 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/crc-debug-fgbqg" event={"ID":"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d","Type":"ContainerStarted","Data":"433b0aa6d9f40e4077322d1e7e4879e667697fc4e1ed460f9229365cd34aa89e"} Jan 30 06:53:01 crc kubenswrapper[4931]: I0130 06:53:01.198791 4931 generic.go:334] "Generic (PLEG): container finished" podID="4472ceea-bcfb-49b8-8c2c-60d6dbeae07d" containerID="e650a4dea4b019515b8b62daf128b41c4a1b5ee09e0661b5a7e0c21439b0ecca" exitCode=1 Jan 30 06:53:01 crc kubenswrapper[4931]: I0130 06:53:01.198857 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/crc-debug-fgbqg" event={"ID":"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d","Type":"ContainerDied","Data":"e650a4dea4b019515b8b62daf128b41c4a1b5ee09e0661b5a7e0c21439b0ecca"} Jan 30 06:53:01 crc kubenswrapper[4931]: I0130 06:53:01.238962 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-57fqf/crc-debug-fgbqg"] Jan 30 06:53:01 crc kubenswrapper[4931]: I0130 06:53:01.249959 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-57fqf/crc-debug-fgbqg"] Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.333769 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-fgbqg" Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.409523 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-host\") pod \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.409774 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nzgg\" (UniqueName: \"kubernetes.io/projected/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-kube-api-access-5nzgg\") pod \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\" (UID: \"4472ceea-bcfb-49b8-8c2c-60d6dbeae07d\") " Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.409885 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-host" (OuterVolumeSpecName: "host") pod "4472ceea-bcfb-49b8-8c2c-60d6dbeae07d" (UID: "4472ceea-bcfb-49b8-8c2c-60d6dbeae07d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.410368 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-host\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.418609 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-kube-api-access-5nzgg" (OuterVolumeSpecName: "kube-api-access-5nzgg") pod "4472ceea-bcfb-49b8-8c2c-60d6dbeae07d" (UID: "4472ceea-bcfb-49b8-8c2c-60d6dbeae07d"). InnerVolumeSpecName "kube-api-access-5nzgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.515231 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nzgg\" (UniqueName: \"kubernetes.io/projected/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d-kube-api-access-5nzgg\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.958681 4931 scope.go:117] "RemoveContainer" containerID="e4d54f2baf7c89b6f0656a4f82e037b9be01132659b5c11c20729732cbeb53c0" Jan 30 06:53:02 crc kubenswrapper[4931]: I0130 06:53:02.991027 4931 scope.go:117] "RemoveContainer" containerID="fc2609bf05101f454b500a411b2af7bec596ed7b2a503264f64b371462ed10d1" Jan 30 06:53:03 crc kubenswrapper[4931]: I0130 06:53:03.047677 4931 scope.go:117] "RemoveContainer" containerID="530188cbe277ff66c92e88116244fb7e483ee43f7966980292a116725f942bcd" Jan 30 06:53:03 crc kubenswrapper[4931]: I0130 06:53:03.095177 4931 scope.go:117] "RemoveContainer" containerID="e1e6f3089c65c6555f71402d2087406c6daee75232130f6c5b8762d180358f01" Jan 30 06:53:03 crc kubenswrapper[4931]: I0130 06:53:03.222291 4931 scope.go:117] "RemoveContainer" containerID="e650a4dea4b019515b8b62daf128b41c4a1b5ee09e0661b5a7e0c21439b0ecca" Jan 30 06:53:03 crc kubenswrapper[4931]: I0130 06:53:03.222432 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/crc-debug-fgbqg"
Jan 30 06:53:03 crc kubenswrapper[4931]: I0130 06:53:03.434562 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4472ceea-bcfb-49b8-8c2c-60d6dbeae07d" path="/var/lib/kubelet/pods/4472ceea-bcfb-49b8-8c2c-60d6dbeae07d/volumes"
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.066208 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-2clsb"]
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.077847 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-2clsb"]
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.363179 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.363237 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.363278 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.364190 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.364249 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" gracePeriod=600
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.453630 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb" path="/var/lib/kubelet/pods/6c7503f1-c8e7-4b48-9dad-a4a221ebdbbb/volumes"
Jan 30 06:53:27 crc kubenswrapper[4931]: E0130 06:53:27.484325 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.498580 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" exitCode=0
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.498627 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d"}
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.498661 4931 scope.go:117] "RemoveContainer" containerID="fb726f99bee299533a0b10daf42c4d3c80f89e1b2459842bb36e1df7a3f9faa7"
Jan 30 06:53:27 crc kubenswrapper[4931]: I0130 06:53:27.499365 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d"
Jan 30 06:53:27 crc kubenswrapper[4931]: E0130 06:53:27.499627 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.040155 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2d95k"]
Jan 30 06:53:35 crc kubenswrapper[4931]: E0130 06:53:35.041768 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4472ceea-bcfb-49b8-8c2c-60d6dbeae07d" containerName="container-00"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.041794 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4472ceea-bcfb-49b8-8c2c-60d6dbeae07d" containerName="container-00"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.042206 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4472ceea-bcfb-49b8-8c2c-60d6dbeae07d" containerName="container-00"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.045118 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.086549 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d95k"]
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.167960 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-catalog-content\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.168146 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-kube-api-access-8jlm5\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.168709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-utilities\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.270213 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-catalog-content\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.270306 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-kube-api-access-8jlm5\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.270391 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-utilities\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.270845 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-catalog-content\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.270876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-utilities\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.297770 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-kube-api-access-8jlm5\") pod \"redhat-marketplace-2d95k\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") " pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.384949 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:35 crc kubenswrapper[4931]: I0130 06:53:35.904375 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d95k"]
Jan 30 06:53:36 crc kubenswrapper[4931]: I0130 06:53:36.610634 4931 generic.go:334] "Generic (PLEG): container finished" podID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerID="968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218" exitCode=0
Jan 30 06:53:36 crc kubenswrapper[4931]: I0130 06:53:36.611044 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d95k" event={"ID":"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf","Type":"ContainerDied","Data":"968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218"}
Jan 30 06:53:36 crc kubenswrapper[4931]: I0130 06:53:36.611126 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d95k" event={"ID":"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf","Type":"ContainerStarted","Data":"833fbe327aa6d698810c9ced1db040fd1f7ae267535aeb95838f24816cb7de3f"}
Jan 30 06:53:37 crc kubenswrapper[4931]: I0130 06:53:37.628106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d95k" event={"ID":"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf","Type":"ContainerStarted","Data":"0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe"}
Jan 30 06:53:38 crc kubenswrapper[4931]: I0130 06:53:38.644400 4931 generic.go:334] "Generic (PLEG): container finished" podID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerID="0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe" exitCode=0
Jan 30 06:53:38 crc kubenswrapper[4931]: I0130 06:53:38.644512 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d95k" event={"ID":"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf","Type":"ContainerDied","Data":"0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe"}
Jan 30 06:53:39 crc kubenswrapper[4931]: I0130 06:53:39.425240 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d"
Jan 30 06:53:39 crc kubenswrapper[4931]: E0130 06:53:39.426322 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:53:39 crc kubenswrapper[4931]: I0130 06:53:39.660703 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d95k" event={"ID":"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf","Type":"ContainerStarted","Data":"501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6"}
Jan 30 06:53:39 crc kubenswrapper[4931]: I0130 06:53:39.699623 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2d95k" podStartSLOduration=2.22668543 podStartE2EDuration="4.699595311s" podCreationTimestamp="2026-01-30 06:53:35 +0000 UTC" firstStartedPulling="2026-01-30 06:53:36.617215743 +0000 UTC m=+6351.987126040" lastFinishedPulling="2026-01-30 06:53:39.090125634 +0000 UTC m=+6354.460035921" observedRunningTime="2026-01-30 06:53:39.685111183 +0000 UTC m=+6355.055021480" watchObservedRunningTime="2026-01-30 06:53:39.699595311 +0000 UTC m=+6355.069505608"
Jan 30 06:53:45 crc kubenswrapper[4931]: I0130 06:53:45.385495 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:45 crc kubenswrapper[4931]: I0130 06:53:45.386248 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:45 crc kubenswrapper[4931]: I0130 06:53:45.473356 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:45 crc kubenswrapper[4931]: I0130 06:53:45.813319 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:45 crc kubenswrapper[4931]: I0130 06:53:45.887773 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d95k"]
Jan 30 06:53:47 crc kubenswrapper[4931]: I0130 06:53:47.757131 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2d95k" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="registry-server" containerID="cri-o://501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6" gracePeriod=2
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.291713 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.427691 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-utilities\") pod \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") "
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.427760 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-kube-api-access-8jlm5\") pod \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") "
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.427914 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-catalog-content\") pod \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\" (UID: \"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf\") "
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.429033 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-utilities" (OuterVolumeSpecName: "utilities") pod "92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" (UID: "92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.436725 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-kube-api-access-8jlm5" (OuterVolumeSpecName: "kube-api-access-8jlm5") pod "92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" (UID: "92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf"). InnerVolumeSpecName "kube-api-access-8jlm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.458822 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" (UID: "92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.531303 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.531339 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jlm5\" (UniqueName: \"kubernetes.io/projected/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-kube-api-access-8jlm5\") on node \"crc\" DevicePath \"\""
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.531354 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.800059 4931 generic.go:334] "Generic (PLEG): container finished" podID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerID="501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6" exitCode=0
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.800104 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d95k" event={"ID":"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf","Type":"ContainerDied","Data":"501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6"}
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.800134 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d95k" event={"ID":"92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf","Type":"ContainerDied","Data":"833fbe327aa6d698810c9ced1db040fd1f7ae267535aeb95838f24816cb7de3f"}
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.800155 4931 scope.go:117] "RemoveContainer" containerID="501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.800291 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d95k"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.839739 4931 scope.go:117] "RemoveContainer" containerID="0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.854512 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d95k"]
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.869978 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d95k"]
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.871963 4931 scope.go:117] "RemoveContainer" containerID="968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.921498 4931 scope.go:117] "RemoveContainer" containerID="501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6"
Jan 30 06:53:48 crc kubenswrapper[4931]: E0130 06:53:48.921943 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6\": container with ID starting with 501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6 not found: ID does not exist" containerID="501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.921976 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6"} err="failed to get container status \"501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6\": rpc error: code = NotFound desc = could not find container \"501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6\": container with ID starting with 501a6df030fbae239f5d8272e091944ecadab7061abe37c89dc9bf780fd65ec6 not found: ID does not exist"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.921995 4931 scope.go:117] "RemoveContainer" containerID="0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe"
Jan 30 06:53:48 crc kubenswrapper[4931]: E0130 06:53:48.922378 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe\": container with ID starting with 0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe not found: ID does not exist" containerID="0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.922399 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe"} err="failed to get container status \"0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe\": rpc error: code = NotFound desc = could not find container \"0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe\": container with ID starting with 0c0a77b03ffc1d974aec5d9744828fca06cfa63316b5bbba0f19f6465b5febbe not found: ID does not exist"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.922411 4931 scope.go:117] "RemoveContainer" containerID="968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218"
Jan 30 06:53:48 crc kubenswrapper[4931]: E0130 06:53:48.922801 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218\": container with ID starting with 968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218 not found: ID does not exist" containerID="968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218"
Jan 30 06:53:48 crc kubenswrapper[4931]: I0130 06:53:48.922833 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218"} err="failed to get container status \"968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218\": rpc error: code = NotFound desc = could not find container \"968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218\": container with ID starting with 968c183b425fecf3811291bc203004e271bc8bbe60568c2d9bc172c3e4840218 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4931]: E0130 06:53:49.011335 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b1c327_8fc5_4f2c_adf4_7c61aa9ff6cf.slice/crio-833fbe327aa6d698810c9ced1db040fd1f7ae267535aeb95838f24816cb7de3f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b1c327_8fc5_4f2c_adf4_7c61aa9ff6cf.slice\": RecentStats: unable to find data in memory cache]"
Jan 30 06:53:49 crc kubenswrapper[4931]: I0130 06:53:49.452864 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" path="/var/lib/kubelet/pods/92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf/volumes"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.347643 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_faef005b-c58c-4b22-944c-defd3471fa32/init-config-reloader/0.log"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.540928 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_faef005b-c58c-4b22-944c-defd3471fa32/alertmanager/0.log"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.548208 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_faef005b-c58c-4b22-944c-defd3471fa32/init-config-reloader/0.log"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.575474 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_faef005b-c58c-4b22-944c-defd3471fa32/config-reloader/0.log"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.700242 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_890734fc-018f-4d2e-bc3e-ef4399f477da/aodh-api/0.log"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.823775 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_890734fc-018f-4d2e-bc3e-ef4399f477da/aodh-evaluator/0.log"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.829930 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_890734fc-018f-4d2e-bc3e-ef4399f477da/aodh-listener/0.log"
Jan 30 06:53:50 crc kubenswrapper[4931]: I0130 06:53:50.876572 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_890734fc-018f-4d2e-bc3e-ef4399f477da/aodh-notifier/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.046135 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-cb1e-account-create-update-6drdw_51e6957d-e715-4a84-9952-19f773cfe882/mariadb-account-create-update/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.117525 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-create-fmv6s_681b527a-d511-4db8-8f19-1df02bbf9f61/mariadb-database-create/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.242614 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-sync-rq4fv_76eec61d-6ff6-4286-9102-758374c6fa27/aodh-db-sync/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.322182 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-847c6776d8-4sw8x_62d9ff65-c8d2-413f-b323-47a1db5ea2ed/barbican-api/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.422584 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d"
Jan 30 06:53:51 crc kubenswrapper[4931]: E0130 06:53:51.422844 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.463830 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-847c6776d8-4sw8x_62d9ff65-c8d2-413f-b323-47a1db5ea2ed/barbican-api-log/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.530739 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67d8db4f6b-c2v48_acbd1f75-958f-4fe5-8d52-f32c4d6c53f1/barbican-keystone-listener/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.603260 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67d8db4f6b-c2v48_acbd1f75-958f-4fe5-8d52-f32c4d6c53f1/barbican-keystone-listener-log/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.743288 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-784d6f7789-45xt8_a72d7303-20af-4fe7-be58-962eaa52c31a/barbican-worker-log/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.792063 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-784d6f7789-45xt8_a72d7303-20af-4fe7-be58-962eaa52c31a/barbican-worker/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.970431 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d49c4b54-cfb7-4264-a6b8-9ee32cc53de7/ceilometer-central-agent/0.log"
Jan 30 06:53:51 crc kubenswrapper[4931]: I0130 06:53:51.972944 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d49c4b54-cfb7-4264-a6b8-9ee32cc53de7/ceilometer-notification-agent/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.048084 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d49c4b54-cfb7-4264-a6b8-9ee32cc53de7/proxy-httpd/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.102002 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d49c4b54-cfb7-4264-a6b8-9ee32cc53de7/sg-core/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.203017 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_abe3ac27-91a6-4c8d-880c-b94ad5bd7aea/cinder-api/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.344287 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_abe3ac27-91a6-4c8d-880c-b94ad5bd7aea/cinder-api-log/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.534632 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9be12b3c-c79f-4719-ab10-e3370519fbe3/cinder-backup/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.580629 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9be12b3c-c79f-4719-ab10-e3370519fbe3/probe/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.660592 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_41673f24-5c01-4401-839f-55da60930b4d/cinder-scheduler/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.832959 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_41673f24-5c01-4401-839f-55da60930b4d/probe/0.log"
Jan 30 06:53:52 crc kubenswrapper[4931]: I0130 06:53:52.963088 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc/cinder-volume/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.006201 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_6ac5ad1d-f9ee-4fe6-8625-d30e49c099fc/probe/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.074552 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59977785bf-q4vw9_b05cd1de-6848-4de5-92f4-399913835db3/init/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.204261 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59977785bf-q4vw9_b05cd1de-6848-4de5-92f4-399913835db3/init/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.237762 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-59977785bf-q4vw9_b05cd1de-6848-4de5-92f4-399913835db3/dnsmasq-dns/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.303369 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_94f5b24a-840b-4206-a190-63cd6339ed70/glance-httpd/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.388582 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_94f5b24a-840b-4206-a190-63cd6339ed70/glance-log/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.493044 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf/glance-httpd/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.522138 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ee8afaf3-b244-4273-8aa5-cd9bfbf10ddf/glance-log/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.694703 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-0207-account-create-update-nwwgb_4110f6ea-5daa-4a1f-8fc2-f9497b7024f7/mariadb-account-create-update/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.741815 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-795f886c68-gphf9_e3a9064f-a3e2-4734-8b77-9e42deff080a/heat-api/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.909342 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7944f98bdf-sfnzs_7094dd36-79d9-4c63-9441-1753815af4a7/heat-cfnapi/0.log"
Jan 30 06:53:53 crc kubenswrapper[4931]: I0130 06:53:53.938074 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-create-md7t7_65b44b5a-7476-44a4-b7ca-e6c246e9afdc/mariadb-database-create/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.106966 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-sync-t75hv_eae9c157-1120-45ac-8d6c-cc417f364b1f/heat-db-sync/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.139985 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6d6f44c564-6wts7_78cdbc3b-0ff9-4204-b62e-bc784e3fcb87/heat-engine/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.267363 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d459c77c7-fncxw_4254a5c6-88bc-4b8f-a425-79d9bea9eb6d/horizon/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.347131 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d459c77c7-fncxw_4254a5c6-88bc-4b8f-a425-79d9bea9eb6d/horizon-log/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.439538 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_eb1cdd0a-4520-49ce-8bc6-686dba45e7e8/kube-state-metrics/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.445949 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6bc679c867-wth9b_18835617-9ad2-4502-bbda-d4ac538081bd/keystone-api/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.682594 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-65c4-account-create-update-rfndg_037161b5-dad9-4d8f-9be4-f980ee947129/mariadb-account-create-update/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.730732 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_50c75c49-3fc8-4f3e-9af2-66535e3b49a9/manila-api/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.767174 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_50c75c49-3fc8-4f3e-9af2-66535e3b49a9/manila-api-log/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.870909 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-create-45ct9_448719bb-ff8e-4d9e-982b-a8425f907a15/mariadb-database-create/0.log"
Jan 30 06:53:54 crc kubenswrapper[4931]: I0130 06:53:54.923974 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-sync-ctjj7_2f518288-3c69-4f3a-9e32-9f9211cab22a/manila-db-sync/0.log"
Jan 30 06:53:55 crc kubenswrapper[4931]: I0130 06:53:55.066157 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_b9eb637a-1c6e-47f5-87ec-fa28c244db0b/probe/0.log"
Jan 30 06:53:55 crc kubenswrapper[4931]: I0130 06:53:55.173836 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_b9eb637a-1c6e-47f5-87ec-fa28c244db0b/manila-scheduler/0.log"
Jan 30 06:53:55 crc kubenswrapper[4931]: I0130 06:53:55.201223 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70/manila-share/0.log"
Jan 30 06:53:55 crc kubenswrapper[4931]: I0130 06:53:55.286272 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_3e8f1fa3-1b0e-4bc4-bde3-48a12ea6de70/probe/0.log"
Jan 30 06:53:55 crc kubenswrapper[4931]: I0130 06:53:55.380217 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_371cff3f-3d31-4dc6-98eb-b03f2d967337/adoption/0.log"
Jan 30 06:53:55 crc kubenswrapper[4931]: I0130 06:53:55.807495 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78978fdd5c-pqg87_ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5/neutron-httpd/0.log"
Jan 30 06:53:55 crc kubenswrapper[4931]: I0130 06:53:55.978688 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78978fdd5c-pqg87_ef26e9fb-2e6d-4582-a140-1ebd8eebc9e5/neutron-api/0.log"
Jan 30 06:53:56 crc kubenswrapper[4931]: I0130 06:53:56.239174 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d940c452-f401-4c40-accd-cb3178bc0490/nova-api-api/0.log"
Jan 30 06:53:56 crc kubenswrapper[4931]: I0130 06:53:56.272992 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d940c452-f401-4c40-accd-cb3178bc0490/nova-api-log/0.log"
Jan 30 06:53:56 crc kubenswrapper[4931]: I0130 06:53:56.349269 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_79469af6-a764-49c6-beaf-b49185c1028a/nova-cell0-conductor-conductor/0.log"
Jan 30 06:53:56 crc kubenswrapper[4931]: I0130 06:53:56.661856 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a12d7d7a-0b33-425e-98be-5a28ef924b22/nova-cell1-conductor-conductor/0.log"
Jan 30 06:53:56 crc kubenswrapper[4931]: I0130 06:53:56.693390 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c9e49a5c-323c-46de-b34f-2fef9465e277/nova-cell1-novncproxy-novncproxy/0.log"
Jan 30 06:53:56 crc kubenswrapper[4931]: I0130 06:53:56.983389 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_36294ba3-fcdd-45cd-b4ff-20ee280751da/nova-metadata-metadata/0.log"
Jan 30 06:53:56 crc kubenswrapper[4931]: I0130 06:53:56.986539 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_36294ba3-fcdd-45cd-b4ff-20ee280751da/nova-metadata-log/0.log"
Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.237142 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ee8c9751-e3b5-4031-bb46-a7e5fae46f4e/nova-scheduler-scheduler/0.log"
Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.243096 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-79684d7c94-4r69m_eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395/init/0.log"
Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.398047 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-79684d7c94-4r69m_eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395/init/0.log"
Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.476784 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-79684d7c94-4r69m_eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395/octavia-api-provider-agent/0.log"
Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.610498 4931
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-k6c7h_2c0bd14d-9378-4c91-87e8-4ec9681103e0/init/0.log" Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.646841 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-79684d7c94-4r69m_eb1ddf4d-a2c5-4eae-8cd0-6eb3702c3395/octavia-api/0.log" Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.849783 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-k6c7h_2c0bd14d-9378-4c91-87e8-4ec9681103e0/octavia-healthmanager/0.log" Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.880493 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-fc9fv_8e6a5234-c995-4b65-afb5-e59eedb65e7f/init/0.log" Jan 30 06:53:57 crc kubenswrapper[4931]: I0130 06:53:57.908707 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-k6c7h_2c0bd14d-9378-4c91-87e8-4ec9681103e0/init/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.094987 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-fc9fv_8e6a5234-c995-4b65-afb5-e59eedb65e7f/init/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.110375 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-fc9fv_8e6a5234-c995-4b65-afb5-e59eedb65e7f/octavia-housekeeping/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.181266 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-g99g6_2c16935c-c83b-4b45-b4cd-b61f20ee764f/init/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.388677 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-g99g6_2c16935c-c83b-4b45-b4cd-b61f20ee764f/init/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.459966 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-rsyslog-g99g6_2c16935c-c83b-4b45-b4cd-b61f20ee764f/octavia-rsyslog/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.471058 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-68w9j_5a81fa26-7f20-43ef-922e-a9e63ee73709/init/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.730812 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-68w9j_5a81fa26-7f20-43ef-922e-a9e63ee73709/init/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.822991 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_66165b19-dfc8-403f-ae09-30299db6b19f/mysql-bootstrap/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.895331 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-68w9j_5a81fa26-7f20-43ef-922e-a9e63ee73709/octavia-worker/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.938867 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_66165b19-dfc8-403f-ae09-30299db6b19f/mysql-bootstrap/0.log" Jan 30 06:53:58 crc kubenswrapper[4931]: I0130 06:53:58.977728 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_66165b19-dfc8-403f-ae09-30299db6b19f/galera/0.log" Jan 30 06:53:59 crc kubenswrapper[4931]: I0130 06:53:59.126891 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3/mysql-bootstrap/0.log" Jan 30 06:53:59 crc kubenswrapper[4931]: E0130 06:53:59.249544 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Jan 30 06:53:59 crc kubenswrapper[4931]: I0130 06:53:59.573542 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3/galera/0.log" Jan 30 06:53:59 crc kubenswrapper[4931]: I0130 06:53:59.585582 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a9261635-8331-44ae-88d1-df73db930d2d/openstackclient/0.log" Jan 30 06:53:59 crc kubenswrapper[4931]: I0130 06:53:59.621006 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_01b91d81-d7c5-4bd6-8cdd-c852a5cd2fc3/mysql-bootstrap/0.log" Jan 30 06:53:59 crc kubenswrapper[4931]: I0130 06:53:59.811319 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-cgvfd_9bf15e4b-1a09-401b-87e9-97cff0ee8c91/ovn-controller/0.log" Jan 30 06:53:59 crc kubenswrapper[4931]: I0130 06:53:59.880971 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cs66w_608bb576-83fd-4c7c-b8b3-a4f9ff46b661/openstack-network-exporter/0.log" Jan 30 06:53:59 crc kubenswrapper[4931]: I0130 06:53:59.997542 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tgfr_56a7c911-151f-42ff-b005-58bdaecd5d8b/ovsdb-server-init/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.235838 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tgfr_56a7c911-151f-42ff-b005-58bdaecd5d8b/ovsdb-server-init/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.236407 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tgfr_56a7c911-151f-42ff-b005-58bdaecd5d8b/ovs-vswitchd/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.273100 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5tgfr_56a7c911-151f-42ff-b005-58bdaecd5d8b/ovsdb-server/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.490287 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe/openstack-network-exporter/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.509303 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_49126964-dfd0-4103-a3fd-5244d9b49b9d/adoption/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.577030 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c1e8d282-d2e1-4053-ad7e-51ee97f7ffbe/ovn-northd/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.682831 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a05ee1a7-c012-4766-8d48-3b508d4f8cd2/openstack-network-exporter/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.727543 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a05ee1a7-c012-4766-8d48-3b508d4f8cd2/ovsdbserver-nb/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.944579 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_6e651c73-1761-4cda-83b7-5a80fa3af6f4/openstack-network-exporter/0.log" Jan 30 06:54:00 crc kubenswrapper[4931]: I0130 06:54:00.968466 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_6e651c73-1761-4cda-83b7-5a80fa3af6f4/ovsdbserver-nb/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.080510 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_60983391-7945-4efe-ae6d-7c6ae80e2df8/openstack-network-exporter/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.158721 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_60983391-7945-4efe-ae6d-7c6ae80e2df8/ovsdbserver-nb/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.247356 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_ffc86399-3f01-4c6a-942d-b255a957dc52/openstack-network-exporter/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.306415 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ffc86399-3f01-4c6a-942d-b255a957dc52/ovsdbserver-sb/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.336658 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f5c3365d-6967-42e2-b00c-887a82a1b73e/memcached/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.469776 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_78634c9d-d8d8-4eed-adc7-fe9fdbf69a11/openstack-network-exporter/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.476469 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_78634c9d-d8d8-4eed-adc7-fe9fdbf69a11/ovsdbserver-sb/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.528392 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_83787678-4305-4893-8aa4-d1ddd8c15343/openstack-network-exporter/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.659146 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_83787678-4305-4893-8aa4-d1ddd8c15343/ovsdbserver-sb/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.678002 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b467bdbbb-ds8j4_6359f2c1-ac0c-4084-969e-7cff11e8b4d8/placement-api/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.735766 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b467bdbbb-ds8j4_6359f2c1-ac0c-4084-969e-7cff11e8b4d8/placement-log/0.log" Jan 30 06:54:01 crc kubenswrapper[4931]: I0130 06:54:01.847775 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_201626a3-bc04-48ab-859c-5a7ffe97670e/init-config-reloader/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.009875 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_201626a3-bc04-48ab-859c-5a7ffe97670e/init-config-reloader/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.010260 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_201626a3-bc04-48ab-859c-5a7ffe97670e/thanos-sidecar/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.041360 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_201626a3-bc04-48ab-859c-5a7ffe97670e/config-reloader/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.054053 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_201626a3-bc04-48ab-859c-5a7ffe97670e/prometheus/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.214496 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ea4042a1-4ebc-4b11-a7e4-e695a668aa81/setup-container/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.362145 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ea4042a1-4ebc-4b11-a7e4-e695a668aa81/setup-container/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.400148 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8dabfefe-4927-44d0-b370-f7e28f2a4f57/setup-container/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.421917 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:54:02 crc kubenswrapper[4931]: E0130 06:54:02.422168 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.456443 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ea4042a1-4ebc-4b11-a7e4-e695a668aa81/rabbitmq/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.577987 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8dabfefe-4927-44d0-b370-f7e28f2a4f57/setup-container/0.log" Jan 30 06:54:02 crc kubenswrapper[4931]: I0130 06:54:02.788171 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8dabfefe-4927-44d0-b370-f7e28f2a4f57/rabbitmq/0.log" Jan 30 06:54:03 crc kubenswrapper[4931]: I0130 06:54:03.217108 4931 scope.go:117] "RemoveContainer" containerID="c4a34d96918d76961993d44bfa88235147b0d786fa8c13e4e74a51d5c91d0e97" Jan 30 06:54:03 crc kubenswrapper[4931]: I0130 06:54:03.246039 4931 scope.go:117] "RemoveContainer" containerID="870048acdaf12d315c0defcb750ddf7659f2ce415320432c25b469719ef6d1d5" Jan 30 06:54:13 crc kubenswrapper[4931]: I0130 06:54:13.459468 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:54:13 crc kubenswrapper[4931]: E0130 06:54:13.460294 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:54:25 crc kubenswrapper[4931]: I0130 06:54:25.283939 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff_7fc41231-569f-429f-bcc3-d7d63888874b/util/0.log" Jan 30 06:54:25 crc kubenswrapper[4931]: I0130 06:54:25.476820 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff_7fc41231-569f-429f-bcc3-d7d63888874b/util/0.log" Jan 30 06:54:25 crc kubenswrapper[4931]: I0130 06:54:25.499635 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff_7fc41231-569f-429f-bcc3-d7d63888874b/pull/0.log" Jan 30 06:54:25 crc kubenswrapper[4931]: I0130 06:54:25.499941 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff_7fc41231-569f-429f-bcc3-d7d63888874b/pull/0.log" Jan 30 06:54:25 crc kubenswrapper[4931]: I0130 06:54:25.678305 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff_7fc41231-569f-429f-bcc3-d7d63888874b/util/0.log" Jan 30 06:54:25 crc kubenswrapper[4931]: I0130 06:54:25.685745 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff_7fc41231-569f-429f-bcc3-d7d63888874b/pull/0.log" Jan 30 06:54:25 crc kubenswrapper[4931]: I0130 06:54:25.688945 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d18p2ff_7fc41231-569f-429f-bcc3-d7d63888874b/extract/0.log" Jan 30 06:54:26 crc kubenswrapper[4931]: I0130 06:54:26.002313 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-4wv6z_80b25db7-e1c2-4787-89f4-952cd7e845ba/manager/0.log" Jan 30 06:54:26 crc kubenswrapper[4931]: I0130 06:54:26.007825 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-mttxk_eb76dd84-30db-4769-852c-9a42814949d7/manager/0.log" Jan 30 06:54:26 crc kubenswrapper[4931]: I0130 06:54:26.107150 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-bf56z_dea1ae69-0c15-4228-a323-dc6f762e3c82/manager/0.log" Jan 30 06:54:26 crc kubenswrapper[4931]: I0130 06:54:26.344490 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-nsn26_2773429e-ccbb-43a4-a88a-a1cd41a63e10/manager/0.log" Jan 30 06:54:26 crc kubenswrapper[4931]: I0130 06:54:26.369482 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-lmgq2_d806e5bf-8346-46c0-a3de-5f8412e92b4f/manager/0.log" Jan 30 06:54:26 crc kubenswrapper[4931]: I0130 06:54:26.478708 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-l5dv2_ce7feb31-22f3-42d9-83b1-cd9155abae99/manager/0.log" Jan 30 06:54:26 crc kubenswrapper[4931]: I0130 06:54:26.743795 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-v9fgj_33b18ace-2da3-4bad-b093-d7db2aad7f50/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.022180 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-ddtbw_cc5025a4-0807-478d-831a-c6ed424628a9/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.032571 4931 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-5sgtg_a3f6ed4d-518f-4415-9378-73fca072d431/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.164578 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-tzxqv_29ae7a52-ff32-4f97-8f6c-830ac4e4b40b/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.282963 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-fsdvn_8553945b-dfe3-4c77-bb73-dce58c6ad3ba/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.421659 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:54:27 crc kubenswrapper[4931]: E0130 06:54:27.422188 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.464964 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-wssqz_5e6de10d-baf2-4ef4-9acf-d093ee65c4fd/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.691572 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-5l9jv_2b83a9b3-5579-438f-8f65-effa382b726c/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.772869 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-kndp7_456074da-531d-471b-92d3-cb4ea156bfae/manager/0.log" Jan 30 06:54:27 crc kubenswrapper[4931]: I0130 06:54:27.879701 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86dfb79cc7hfxjp_47b128c8-46ef-422c-aabc-1220f85fef83/manager/0.log" Jan 30 06:54:28 crc kubenswrapper[4931]: I0130 06:54:28.127094 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-757f46c65d-rscb9_27c443b8-82d2-41c1-b747-b89e6cb44f16/operator/0.log" Jan 30 06:54:28 crc kubenswrapper[4931]: I0130 06:54:28.718396 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7znpc_2efa198c-4fe6-4ed2-9627-14a9ce525363/registry-server/0.log" Jan 30 06:54:28 crc kubenswrapper[4931]: I0130 06:54:28.839176 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-mkk7j_a536697c-8056-4907-a09e-b23aa129435d/manager/0.log" Jan 30 06:54:29 crc kubenswrapper[4931]: I0130 06:54:29.004747 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-t4scx_59634caa-7fe0-49a1-98bf-dbc61a15f495/manager/0.log" Jan 30 06:54:29 crc kubenswrapper[4931]: I0130 06:54:29.082596 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-v4vnz_ad890bc5-5b72-4833-86d5-2c022cd87e4a/operator/0.log" Jan 30 06:54:29 crc kubenswrapper[4931]: I0130 06:54:29.470025 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-gqvgs_3d63764e-5f26-4a63-870a-af0e86eb5d23/manager/0.log" Jan 30 06:54:29 crc kubenswrapper[4931]: I0130 06:54:29.732519 4931 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-vccxr_9e5eb1e9-111a-4230-92d6-5b1fbc332ada/manager/0.log" Jan 30 06:54:29 crc kubenswrapper[4931]: I0130 06:54:29.780314 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-gqv2m_8e470db6-3785-4da2-9b83-5242d6712d6a/manager/0.log" Jan 30 06:54:29 crc kubenswrapper[4931]: I0130 06:54:29.823319 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-vqp2s_6d92f2e0-367c-428a-bcd5-cf6e5846046f/manager/0.log" Jan 30 06:54:30 crc kubenswrapper[4931]: I0130 06:54:30.080621 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b6f655c79-qpp9f_5852e12a-376e-420f-a0fd-efecae7ef623/manager/0.log" Jan 30 06:54:41 crc kubenswrapper[4931]: I0130 06:54:41.423051 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:54:41 crc kubenswrapper[4931]: E0130 06:54:41.424340 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:54:53 crc kubenswrapper[4931]: I0130 06:54:53.173329 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w2zzb_e521b474-9f29-4841-a365-ed1589358607/control-plane-machine-set-operator/0.log" Jan 30 06:54:53 crc kubenswrapper[4931]: I0130 06:54:53.359304 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k9mcd_177d163e-7881-411f-a61b-a00e9c8bc9dc/kube-rbac-proxy/0.log" Jan 30 06:54:53 crc kubenswrapper[4931]: I0130 06:54:53.419030 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k9mcd_177d163e-7881-411f-a61b-a00e9c8bc9dc/machine-api-operator/0.log" Jan 30 06:54:54 crc kubenswrapper[4931]: I0130 06:54:54.422652 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:54:54 crc kubenswrapper[4931]: E0130 06:54:54.423077 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:55:08 crc kubenswrapper[4931]: I0130 06:55:08.423937 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:55:08 crc kubenswrapper[4931]: E0130 06:55:08.424767 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:55:08 crc kubenswrapper[4931]: I0130 06:55:08.546046 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-8l2w4_34b0cb15-9c48-4bb3-89e7-85efd5b8b76c/cert-manager-controller/0.log" Jan 30 06:55:08 crc 
kubenswrapper[4931]: I0130 06:55:08.782348 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-hsrfm_5049b2a6-f85e-4250-9b12-c70705adaf35/cert-manager-webhook/0.log" Jan 30 06:55:08 crc kubenswrapper[4931]: I0130 06:55:08.791988 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-qvz8h_39da06e0-e9ea-4570-b486-3c0d2fe79820/cert-manager-cainjector/0.log" Jan 30 06:55:20 crc kubenswrapper[4931]: I0130 06:55:20.422098 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:55:20 crc kubenswrapper[4931]: E0130 06:55:20.422817 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:55:24 crc kubenswrapper[4931]: I0130 06:55:24.480095 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-mwzz2_8800ae15-51ee-4310-889d-3608008986bd/nmstate-console-plugin/0.log" Jan 30 06:55:24 crc kubenswrapper[4931]: I0130 06:55:24.686182 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6mhzq_66e77bed-ca3a-4cfe-874c-d6874c52ab0e/nmstate-handler/0.log" Jan 30 06:55:24 crc kubenswrapper[4931]: I0130 06:55:24.788251 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-2z4jr_01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c/kube-rbac-proxy/0.log" Jan 30 06:55:24 crc kubenswrapper[4931]: I0130 06:55:24.870903 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-2z4jr_01e6ed8f-a69f-4e32-b275-6ea9a5cebf1c/nmstate-metrics/0.log" Jan 30 06:55:24 crc kubenswrapper[4931]: I0130 06:55:24.982885 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-5tdhq_1c291268-6fc4-48a1-94dc-1e9e052e7bc6/nmstate-operator/0.log" Jan 30 06:55:25 crc kubenswrapper[4931]: I0130 06:55:25.059008 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-krf2l_a115b68a-a9ad-44db-90f5-1f016556956a/nmstate-webhook/0.log" Jan 30 06:55:33 crc kubenswrapper[4931]: I0130 06:55:33.422628 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:55:33 crc kubenswrapper[4931]: E0130 06:55:33.423281 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:55:41 crc kubenswrapper[4931]: I0130 06:55:41.032912 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-lx27m_2668098b-064f-4807-b2ee-7efb5dc89fb8/prometheus-operator/0.log" Jan 30 06:55:41 crc kubenswrapper[4931]: I0130 06:55:41.242304 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr_38907dab-62b6-4364-b48c-8300b1fa2ad2/prometheus-operator-admission-webhook/0.log" Jan 30 06:55:41 crc kubenswrapper[4931]: I0130 06:55:41.299938 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z_119a1b91-5877-408e-8721-dccac5a05367/prometheus-operator-admission-webhook/0.log" Jan 30 06:55:41 crc kubenswrapper[4931]: I0130 06:55:41.448319 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-qm276_c63c0b3f-7290-4318-8db6-a1ae150b22e0/operator/0.log" Jan 30 06:55:41 crc kubenswrapper[4931]: I0130 06:55:41.497855 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gw297_55072f8e-c1ef-45fd-9ec3-43e74afed3a7/perses-operator/0.log" Jan 30 06:55:45 crc kubenswrapper[4931]: I0130 06:55:45.428230 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:55:45 crc kubenswrapper[4931]: E0130 06:55:45.428996 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.286970 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-g5mxs_c9c06e8c-f207-490b-8bea-d6a742d63e72/kube-rbac-proxy/0.log" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.422060 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:55:58 crc kubenswrapper[4931]: E0130 06:55:58.422611 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.542369 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-frr-files/0.log" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.803013 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-metrics/0.log" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.806110 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-g5mxs_c9c06e8c-f207-490b-8bea-d6a742d63e72/controller/0.log" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.810014 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-reloader/0.log" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.817854 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-frr-files/0.log" Jan 30 06:55:58 crc kubenswrapper[4931]: I0130 06:55:58.995923 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-reloader/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.202978 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-frr-files/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.234397 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-reloader/0.log" Jan 30 06:55:59 crc 
kubenswrapper[4931]: I0130 06:55:59.236899 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-metrics/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.237214 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-metrics/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.480465 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-metrics/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.499116 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-reloader/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.499156 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/cp-frr-files/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.538666 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/controller/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.696389 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/frr-metrics/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.759456 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/kube-rbac-proxy/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.785218 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/kube-rbac-proxy-frr/0.log" Jan 30 06:55:59 crc kubenswrapper[4931]: I0130 06:55:59.895027 4931 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/reloader/0.log" Jan 30 06:56:00 crc kubenswrapper[4931]: I0130 06:56:00.027074 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-56ftz_3ed7b42a-5ea1-4c7e-b3bc-a0431a72696e/frr-k8s-webhook-server/0.log" Jan 30 06:56:00 crc kubenswrapper[4931]: I0130 06:56:00.184923 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6969d469fc-rzjqg_164111f5-1bd4-4fc2-84f5-7418ee6e7e62/manager/0.log" Jan 30 06:56:00 crc kubenswrapper[4931]: I0130 06:56:00.352999 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7659bb7b4d-ssrqf_47321851-ef2d-47a3-949a-58f2e87df8dd/webhook-server/0.log" Jan 30 06:56:00 crc kubenswrapper[4931]: I0130 06:56:00.503922 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rcpl2_f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18/kube-rbac-proxy/0.log" Jan 30 06:56:01 crc kubenswrapper[4931]: I0130 06:56:01.245858 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rcpl2_f9e7a8e0-b04b-4d38-a05c-a9baa66d2c18/speaker/0.log" Jan 30 06:56:02 crc kubenswrapper[4931]: I0130 06:56:02.042563 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-768qr_be5b19a8-200f-462e-b8f2-fc956ec52080/frr/0.log" Jan 30 06:56:10 crc kubenswrapper[4931]: I0130 06:56:10.422654 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:56:10 crc kubenswrapper[4931]: E0130 06:56:10.424546 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:56:11 crc kubenswrapper[4931]: I0130 06:56:11.065631 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-0207-account-create-update-nwwgb"] Jan 30 06:56:11 crc kubenswrapper[4931]: I0130 06:56:11.077506 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-md7t7"] Jan 30 06:56:11 crc kubenswrapper[4931]: I0130 06:56:11.090781 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-md7t7"] Jan 30 06:56:11 crc kubenswrapper[4931]: I0130 06:56:11.101499 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-0207-account-create-update-nwwgb"] Jan 30 06:56:11 crc kubenswrapper[4931]: I0130 06:56:11.435967 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4110f6ea-5daa-4a1f-8fc2-f9497b7024f7" path="/var/lib/kubelet/pods/4110f6ea-5daa-4a1f-8fc2-f9497b7024f7/volumes" Jan 30 06:56:11 crc kubenswrapper[4931]: I0130 06:56:11.438087 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b44b5a-7476-44a4-b7ca-e6c246e9afdc" path="/var/lib/kubelet/pods/65b44b5a-7476-44a4-b7ca-e6c246e9afdc/volumes" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.006629 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6_150d0383-4876-424e-b189-6ce3cceccb72/util/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.155005 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6_150d0383-4876-424e-b189-6ce3cceccb72/pull/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.157201 4931 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6_150d0383-4876-424e-b189-6ce3cceccb72/util/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.197543 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6_150d0383-4876-424e-b189-6ce3cceccb72/pull/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.348211 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6_150d0383-4876-424e-b189-6ce3cceccb72/pull/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.362911 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6_150d0383-4876-424e-b189-6ce3cceccb72/util/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.372208 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6bjp6_150d0383-4876-424e-b189-6ce3cceccb72/extract/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.529580 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4_52241d6a-5526-4d2b-baeb-e1fd0361a188/util/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.704611 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4_52241d6a-5526-4d2b-baeb-e1fd0361a188/pull/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.710090 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4_52241d6a-5526-4d2b-baeb-e1fd0361a188/pull/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.724787 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4_52241d6a-5526-4d2b-baeb-e1fd0361a188/util/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.895945 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4_52241d6a-5526-4d2b-baeb-e1fd0361a188/util/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.915313 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4_52241d6a-5526-4d2b-baeb-e1fd0361a188/pull/0.log" Jan 30 06:56:16 crc kubenswrapper[4931]: I0130 06:56:16.946785 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7134nvt4_52241d6a-5526-4d2b-baeb-e1fd0361a188/extract/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.107850 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd_686d3bad-998e-4688-a556-c25a0770810a/util/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.289820 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd_686d3bad-998e-4688-a556-c25a0770810a/pull/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.337921 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd_686d3bad-998e-4688-a556-c25a0770810a/pull/0.log" Jan 30 
06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.347530 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd_686d3bad-998e-4688-a556-c25a0770810a/util/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.506930 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd_686d3bad-998e-4688-a556-c25a0770810a/util/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.524629 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd_686d3bad-998e-4688-a556-c25a0770810a/extract/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.547719 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5tdhwd_686d3bad-998e-4688-a556-c25a0770810a/pull/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.701649 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll_8db6c802-44ea-48b4-a63f-c6c43492e6bc/util/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.873015 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll_8db6c802-44ea-48b4-a63f-c6c43492e6bc/util/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.908577 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll_8db6c802-44ea-48b4-a63f-c6c43492e6bc/pull/0.log" Jan 30 06:56:17 crc kubenswrapper[4931]: I0130 06:56:17.914051 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll_8db6c802-44ea-48b4-a63f-c6c43492e6bc/pull/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.030939 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll_8db6c802-44ea-48b4-a63f-c6c43492e6bc/util/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.058266 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll_8db6c802-44ea-48b4-a63f-c6c43492e6bc/pull/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.092043 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gknll_8db6c802-44ea-48b4-a63f-c6c43492e6bc/extract/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.222963 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn8rd_f88493be-1e8e-47b8-9ac7-d035ba0b6e36/extract-utilities/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.399812 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn8rd_f88493be-1e8e-47b8-9ac7-d035ba0b6e36/extract-content/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.427228 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn8rd_f88493be-1e8e-47b8-9ac7-d035ba0b6e36/extract-utilities/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.434664 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn8rd_f88493be-1e8e-47b8-9ac7-d035ba0b6e36/extract-content/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.572498 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wn8rd_f88493be-1e8e-47b8-9ac7-d035ba0b6e36/extract-utilities/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.591083 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wn8rd_f88493be-1e8e-47b8-9ac7-d035ba0b6e36/extract-content/0.log" Jan 30 06:56:18 crc kubenswrapper[4931]: I0130 06:56:18.790897 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lw8bs_9591e541-c3a7-4565-a829-b3da700f84ff/extract-utilities/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.026537 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lw8bs_9591e541-c3a7-4565-a829-b3da700f84ff/extract-utilities/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.059089 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lw8bs_9591e541-c3a7-4565-a829-b3da700f84ff/extract-content/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.097118 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lw8bs_9591e541-c3a7-4565-a829-b3da700f84ff/extract-content/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.256177 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lw8bs_9591e541-c3a7-4565-a829-b3da700f84ff/extract-content/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.295160 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lw8bs_9591e541-c3a7-4565-a829-b3da700f84ff/extract-utilities/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.327210 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wn8rd_f88493be-1e8e-47b8-9ac7-d035ba0b6e36/registry-server/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.465212 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ng75v_29014adb-d772-451f-b4bf-9fdb5d417d1e/marketplace-operator/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.541272 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-w6b74_5aacb80d-976e-4059-9c84-857aab618f4e/extract-utilities/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.746352 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-w6b74_5aacb80d-976e-4059-9c84-857aab618f4e/extract-content/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.755820 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-w6b74_5aacb80d-976e-4059-9c84-857aab618f4e/extract-content/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.766262 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lw8bs_9591e541-c3a7-4565-a829-b3da700f84ff/registry-server/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.798542 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-w6b74_5aacb80d-976e-4059-9c84-857aab618f4e/extract-utilities/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.964312 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kg222_4c0c107d-a03c-479f-b127-2824affd9b35/extract-utilities/0.log" Jan 30 06:56:19 crc kubenswrapper[4931]: I0130 06:56:19.977664 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-w6b74_5aacb80d-976e-4059-9c84-857aab618f4e/extract-utilities/0.log" Jan 30 06:56:20 crc kubenswrapper[4931]: I0130 06:56:20.015853 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-w6b74_5aacb80d-976e-4059-9c84-857aab618f4e/extract-content/0.log" Jan 30 06:56:20 crc kubenswrapper[4931]: I0130 06:56:20.186891 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-w6b74_5aacb80d-976e-4059-9c84-857aab618f4e/registry-server/0.log" Jan 30 06:56:20 crc kubenswrapper[4931]: I0130 06:56:20.236261 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kg222_4c0c107d-a03c-479f-b127-2824affd9b35/extract-utilities/0.log" Jan 30 06:56:20 crc kubenswrapper[4931]: I0130 06:56:20.236562 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kg222_4c0c107d-a03c-479f-b127-2824affd9b35/extract-content/0.log" Jan 30 06:56:20 crc kubenswrapper[4931]: I0130 06:56:20.246658 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kg222_4c0c107d-a03c-479f-b127-2824affd9b35/extract-content/0.log" Jan 30 06:56:20 crc kubenswrapper[4931]: I0130 06:56:20.414280 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kg222_4c0c107d-a03c-479f-b127-2824affd9b35/extract-content/0.log" Jan 30 06:56:20 crc kubenswrapper[4931]: I0130 06:56:20.425205 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kg222_4c0c107d-a03c-479f-b127-2824affd9b35/extract-utilities/0.log" Jan 30 06:56:21 crc kubenswrapper[4931]: I0130 06:56:21.142541 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kg222_4c0c107d-a03c-479f-b127-2824affd9b35/registry-server/0.log" Jan 30 
06:56:23 crc kubenswrapper[4931]: I0130 06:56:23.051544 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-t75hv"] Jan 30 06:56:23 crc kubenswrapper[4931]: I0130 06:56:23.063871 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-t75hv"] Jan 30 06:56:23 crc kubenswrapper[4931]: I0130 06:56:23.435088 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae9c157-1120-45ac-8d6c-cc417f364b1f" path="/var/lib/kubelet/pods/eae9c157-1120-45ac-8d6c-cc417f364b1f/volumes" Jan 30 06:56:24 crc kubenswrapper[4931]: I0130 06:56:24.423807 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:56:24 crc kubenswrapper[4931]: E0130 06:56:24.424248 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:56:34 crc kubenswrapper[4931]: I0130 06:56:34.063543 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-689f7d46dd-nd6wr_38907dab-62b6-4364-b48c-8300b1fa2ad2/prometheus-operator-admission-webhook/0.log" Jan 30 06:56:34 crc kubenswrapper[4931]: I0130 06:56:34.069098 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-lx27m_2668098b-064f-4807-b2ee-7efb5dc89fb8/prometheus-operator/0.log" Jan 30 06:56:34 crc kubenswrapper[4931]: I0130 06:56:34.086451 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-689f7d46dd-vv89z_119a1b91-5877-408e-8721-dccac5a05367/prometheus-operator-admission-webhook/0.log" Jan 30 06:56:34 crc kubenswrapper[4931]: I0130 06:56:34.258328 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gw297_55072f8e-c1ef-45fd-9ec3-43e74afed3a7/perses-operator/0.log" Jan 30 06:56:34 crc kubenswrapper[4931]: I0130 06:56:34.271517 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-qm276_c63c0b3f-7290-4318-8db6-a1ae150b22e0/operator/0.log" Jan 30 06:56:37 crc kubenswrapper[4931]: I0130 06:56:37.422947 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:56:37 crc kubenswrapper[4931]: E0130 06:56:37.423702 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:56:50 crc kubenswrapper[4931]: I0130 06:56:50.422478 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:56:50 crc kubenswrapper[4931]: E0130 06:56:50.423411 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" 
podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:57:01 crc kubenswrapper[4931]: E0130 06:57:01.722725 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.179:47192->38.102.83.179:45103: write tcp 38.102.83.179:47192->38.102.83.179:45103: write: broken pipe Jan 30 06:57:03 crc kubenswrapper[4931]: I0130 06:57:03.423370 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:57:03 crc kubenswrapper[4931]: E0130 06:57:03.424189 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:57:03 crc kubenswrapper[4931]: I0130 06:57:03.459654 4931 scope.go:117] "RemoveContainer" containerID="0c6e2269ccd94b91b1bc61c0d6038a0f312e1ff979dd42b9e25c99edd027ce3a" Jan 30 06:57:03 crc kubenswrapper[4931]: I0130 06:57:03.520455 4931 scope.go:117] "RemoveContainer" containerID="21f9a5808ddea5a133a2d53b441ae213a916b33040640a7576f4d7163df3f19d" Jan 30 06:57:03 crc kubenswrapper[4931]: I0130 06:57:03.565247 4931 scope.go:117] "RemoveContainer" containerID="461da2cabb65077a09c290e33233aae28ff5843458cdbe68b5fe17f6c78dd05f" Jan 30 06:57:18 crc kubenswrapper[4931]: I0130 06:57:18.421847 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:57:18 crc kubenswrapper[4931]: E0130 06:57:18.422726 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:57:30 crc kubenswrapper[4931]: I0130 06:57:30.422304 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:57:30 crc kubenswrapper[4931]: E0130 06:57:30.423270 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:57:44 crc kubenswrapper[4931]: I0130 06:57:44.423127 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:57:44 crc kubenswrapper[4931]: E0130 06:57:44.424161 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:57:59 crc kubenswrapper[4931]: I0130 06:57:59.422203 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:57:59 crc kubenswrapper[4931]: E0130 06:57:59.423294 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:58:06 crc kubenswrapper[4931]: I0130 06:58:06.898695 4931 generic.go:334] "Generic (PLEG): container finished" podID="1075a448-992c-4364-842b-06dda255cd42" containerID="f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92" exitCode=0 Jan 30 06:58:06 crc kubenswrapper[4931]: I0130 06:58:06.898924 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-57fqf/must-gather-v9ths" event={"ID":"1075a448-992c-4364-842b-06dda255cd42","Type":"ContainerDied","Data":"f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92"} Jan 30 06:58:06 crc kubenswrapper[4931]: I0130 06:58:06.899880 4931 scope.go:117] "RemoveContainer" containerID="f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92" Jan 30 06:58:07 crc kubenswrapper[4931]: I0130 06:58:07.378237 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-57fqf_must-gather-v9ths_1075a448-992c-4364-842b-06dda255cd42/gather/0.log" Jan 30 06:58:14 crc kubenswrapper[4931]: I0130 06:58:14.421953 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:58:14 crc kubenswrapper[4931]: E0130 06:58:14.422701 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:58:14 crc kubenswrapper[4931]: I0130 06:58:14.976192 4931 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-must-gather-57fqf/must-gather-v9ths"] Jan 30 06:58:14 crc kubenswrapper[4931]: I0130 06:58:14.976480 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-57fqf/must-gather-v9ths" podUID="1075a448-992c-4364-842b-06dda255cd42" containerName="copy" containerID="cri-o://b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942" gracePeriod=2 Jan 30 06:58:14 crc kubenswrapper[4931]: I0130 06:58:14.987269 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-57fqf/must-gather-v9ths"] Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.454009 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-57fqf_must-gather-v9ths_1075a448-992c-4364-842b-06dda255cd42/copy/0.log" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.456081 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.565064 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcnzx\" (UniqueName: \"kubernetes.io/projected/1075a448-992c-4364-842b-06dda255cd42-kube-api-access-wcnzx\") pod \"1075a448-992c-4364-842b-06dda255cd42\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.565400 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1075a448-992c-4364-842b-06dda255cd42-must-gather-output\") pod \"1075a448-992c-4364-842b-06dda255cd42\" (UID: \"1075a448-992c-4364-842b-06dda255cd42\") " Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.574782 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1075a448-992c-4364-842b-06dda255cd42-kube-api-access-wcnzx" 
(OuterVolumeSpecName: "kube-api-access-wcnzx") pod "1075a448-992c-4364-842b-06dda255cd42" (UID: "1075a448-992c-4364-842b-06dda255cd42"). InnerVolumeSpecName "kube-api-access-wcnzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.667935 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcnzx\" (UniqueName: \"kubernetes.io/projected/1075a448-992c-4364-842b-06dda255cd42-kube-api-access-wcnzx\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.740758 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1075a448-992c-4364-842b-06dda255cd42-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1075a448-992c-4364-842b-06dda255cd42" (UID: "1075a448-992c-4364-842b-06dda255cd42"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.769711 4931 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1075a448-992c-4364-842b-06dda255cd42-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.998226 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-57fqf_must-gather-v9ths_1075a448-992c-4364-842b-06dda255cd42/copy/0.log" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.998716 4931 generic.go:334] "Generic (PLEG): container finished" podID="1075a448-992c-4364-842b-06dda255cd42" containerID="b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942" exitCode=143 Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.998852 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-57fqf/must-gather-v9ths" Jan 30 06:58:15 crc kubenswrapper[4931]: I0130 06:58:15.998884 4931 scope.go:117] "RemoveContainer" containerID="b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942" Jan 30 06:58:16 crc kubenswrapper[4931]: I0130 06:58:16.034895 4931 scope.go:117] "RemoveContainer" containerID="f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92" Jan 30 06:58:16 crc kubenswrapper[4931]: I0130 06:58:16.134370 4931 scope.go:117] "RemoveContainer" containerID="b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942" Jan 30 06:58:16 crc kubenswrapper[4931]: E0130 06:58:16.135175 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942\": container with ID starting with b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942 not found: ID does not exist" containerID="b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942" Jan 30 06:58:16 crc kubenswrapper[4931]: I0130 06:58:16.135286 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942"} err="failed to get container status \"b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942\": rpc error: code = NotFound desc = could not find container \"b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942\": container with ID starting with b26cd51fcb25ad66a348c9e69c9d67a17e9cbab697a628c6367a7c23833c1942 not found: ID does not exist" Jan 30 06:58:16 crc kubenswrapper[4931]: I0130 06:58:16.135366 4931 scope.go:117] "RemoveContainer" containerID="f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92" Jan 30 06:58:16 crc kubenswrapper[4931]: E0130 06:58:16.135822 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92\": container with ID starting with f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92 not found: ID does not exist" containerID="f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92" Jan 30 06:58:16 crc kubenswrapper[4931]: I0130 06:58:16.135900 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92"} err="failed to get container status \"f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92\": rpc error: code = NotFound desc = could not find container \"f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92\": container with ID starting with f882eeb17f43412896a11902692366cd99cf2b3a65c45cfc4ea200e7ab314d92 not found: ID does not exist" Jan 30 06:58:17 crc kubenswrapper[4931]: I0130 06:58:17.438346 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1075a448-992c-4364-842b-06dda255cd42" path="/var/lib/kubelet/pods/1075a448-992c-4364-842b-06dda255cd42/volumes" Jan 30 06:58:26 crc kubenswrapper[4931]: I0130 06:58:26.423007 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:58:26 crc kubenswrapper[4931]: E0130 06:58:26.424111 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfdxs_openshift-machine-config-operator(189be3dc-d439-47c2-b1f2-7413fc4b5e85)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.190067 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-6fq22"] Jan 30 06:58:27 crc kubenswrapper[4931]: E0130 06:58:27.191283 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="extract-utilities" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191301 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="extract-utilities" Jan 30 06:58:27 crc kubenswrapper[4931]: E0130 06:58:27.191328 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1075a448-992c-4364-842b-06dda255cd42" containerName="copy" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191336 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1075a448-992c-4364-842b-06dda255cd42" containerName="copy" Jan 30 06:58:27 crc kubenswrapper[4931]: E0130 06:58:27.191353 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="registry-server" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191362 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="registry-server" Jan 30 06:58:27 crc kubenswrapper[4931]: E0130 06:58:27.191375 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="extract-content" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191383 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="extract-content" Jan 30 06:58:27 crc kubenswrapper[4931]: E0130 06:58:27.191417 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1075a448-992c-4364-842b-06dda255cd42" containerName="gather" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191460 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1075a448-992c-4364-842b-06dda255cd42" 
containerName="gather" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191722 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b1c327-8fc5-4f2c-adf4-7c61aa9ff6cf" containerName="registry-server" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191734 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1075a448-992c-4364-842b-06dda255cd42" containerName="copy" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.191755 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="1075a448-992c-4364-842b-06dda255cd42" containerName="gather" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.195558 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.219552 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fq22"] Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.360810 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-utilities\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.360949 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-catalog-content\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.361012 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcbxb\" (UniqueName: 
\"kubernetes.io/projected/81aaff5d-9686-458d-bd32-221d0ae71038-kube-api-access-hcbxb\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.463452 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-utilities\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.463567 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-catalog-content\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.463598 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcbxb\" (UniqueName: \"kubernetes.io/projected/81aaff5d-9686-458d-bd32-221d0ae71038-kube-api-access-hcbxb\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.464253 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-utilities\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.464319 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-catalog-content\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.486180 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcbxb\" (UniqueName: \"kubernetes.io/projected/81aaff5d-9686-458d-bd32-221d0ae71038-kube-api-access-hcbxb\") pod \"certified-operators-6fq22\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:27 crc kubenswrapper[4931]: I0130 06:58:27.521639 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:28 crc kubenswrapper[4931]: I0130 06:58:28.133537 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fq22"] Jan 30 06:58:28 crc kubenswrapper[4931]: I0130 06:58:28.156086 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fq22" event={"ID":"81aaff5d-9686-458d-bd32-221d0ae71038","Type":"ContainerStarted","Data":"44778904158eedde6f889cfc7e7aa952fb69ead9dd240f8b4485e99e02abc043"} Jan 30 06:58:29 crc kubenswrapper[4931]: I0130 06:58:29.168210 4931 generic.go:334] "Generic (PLEG): container finished" podID="81aaff5d-9686-458d-bd32-221d0ae71038" containerID="f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b" exitCode=0 Jan 30 06:58:29 crc kubenswrapper[4931]: I0130 06:58:29.168302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fq22" event={"ID":"81aaff5d-9686-458d-bd32-221d0ae71038","Type":"ContainerDied","Data":"f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b"} Jan 30 06:58:29 crc kubenswrapper[4931]: I0130 06:58:29.171294 4931 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:58:31 crc kubenswrapper[4931]: I0130 06:58:31.203081 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fq22" event={"ID":"81aaff5d-9686-458d-bd32-221d0ae71038","Type":"ContainerStarted","Data":"af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0"} Jan 30 06:58:31 crc kubenswrapper[4931]: I0130 06:58:31.990629 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cc279"] Jan 30 06:58:31 crc kubenswrapper[4931]: I0130 06:58:31.996855 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.006644 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cc279"] Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.089499 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-catalog-content\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.089646 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dgkk\" (UniqueName: \"kubernetes.io/projected/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-kube-api-access-7dgkk\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.089978 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-utilities\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.192104 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-utilities\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.192306 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-catalog-content\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.192342 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dgkk\" (UniqueName: \"kubernetes.io/projected/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-kube-api-access-7dgkk\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.192800 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-utilities\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.192947 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-catalog-content\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.213280 4931 generic.go:334] "Generic (PLEG): container finished" podID="81aaff5d-9686-458d-bd32-221d0ae71038" containerID="af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0" exitCode=0 Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.213331 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fq22" event={"ID":"81aaff5d-9686-458d-bd32-221d0ae71038","Type":"ContainerDied","Data":"af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0"} Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.224574 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dgkk\" (UniqueName: \"kubernetes.io/projected/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-kube-api-access-7dgkk\") pod \"redhat-operators-cc279\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.332093 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:32 crc kubenswrapper[4931]: I0130 06:58:32.826524 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cc279"] Jan 30 06:58:33 crc kubenswrapper[4931]: I0130 06:58:33.227637 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cc279" event={"ID":"dc9d97e5-c723-4f3e-b6f4-9e123f907f07","Type":"ContainerDied","Data":"b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c"} Jan 30 06:58:33 crc kubenswrapper[4931]: I0130 06:58:33.227495 4931 generic.go:334] "Generic (PLEG): container finished" podID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerID="b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c" exitCode=0 Jan 30 06:58:33 crc kubenswrapper[4931]: I0130 06:58:33.231061 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cc279" event={"ID":"dc9d97e5-c723-4f3e-b6f4-9e123f907f07","Type":"ContainerStarted","Data":"be5ecc3b88dee9a9394ab880d76a6331d57c03091370e6e5e590da9fe25fb187"} Jan 30 06:58:33 crc kubenswrapper[4931]: I0130 06:58:33.235076 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fq22" event={"ID":"81aaff5d-9686-458d-bd32-221d0ae71038","Type":"ContainerStarted","Data":"89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4"} Jan 30 06:58:33 crc kubenswrapper[4931]: I0130 06:58:33.289250 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6fq22" podStartSLOduration=2.641569818 podStartE2EDuration="6.289207806s" podCreationTimestamp="2026-01-30 06:58:27 +0000 UTC" firstStartedPulling="2026-01-30 06:58:29.171079184 +0000 UTC m=+6644.540989441" lastFinishedPulling="2026-01-30 06:58:32.818717172 +0000 UTC m=+6648.188627429" observedRunningTime="2026-01-30 06:58:33.270349486 +0000 UTC 
m=+6648.640259753" watchObservedRunningTime="2026-01-30 06:58:33.289207806 +0000 UTC m=+6648.659118073" Jan 30 06:58:35 crc kubenswrapper[4931]: I0130 06:58:35.257785 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cc279" event={"ID":"dc9d97e5-c723-4f3e-b6f4-9e123f907f07","Type":"ContainerStarted","Data":"7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe"} Jan 30 06:58:37 crc kubenswrapper[4931]: I0130 06:58:37.522523 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:37 crc kubenswrapper[4931]: I0130 06:58:37.523153 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:37 crc kubenswrapper[4931]: I0130 06:58:37.601705 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:38 crc kubenswrapper[4931]: I0130 06:58:38.384115 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:38 crc kubenswrapper[4931]: I0130 06:58:38.423000 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d" Jan 30 06:58:39 crc kubenswrapper[4931]: I0130 06:58:39.312050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"e8299e6538c113448d4bfd5d8d4cfd1ff286a1ac24b25d9c331a09fd1c9ae275"} Jan 30 06:58:40 crc kubenswrapper[4931]: I0130 06:58:40.325493 4931 generic.go:334] "Generic (PLEG): container finished" podID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerID="7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe" exitCode=0 Jan 30 06:58:40 crc kubenswrapper[4931]: 
I0130 06:58:40.325832 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cc279" event={"ID":"dc9d97e5-c723-4f3e-b6f4-9e123f907f07","Type":"ContainerDied","Data":"7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe"} Jan 30 06:58:40 crc kubenswrapper[4931]: I0130 06:58:40.370344 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fq22"] Jan 30 06:58:40 crc kubenswrapper[4931]: I0130 06:58:40.370789 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6fq22" podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="registry-server" containerID="cri-o://89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4" gracePeriod=2 Jan 30 06:58:40 crc kubenswrapper[4931]: I0130 06:58:40.985951 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.129243 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-catalog-content\") pod \"81aaff5d-9686-458d-bd32-221d0ae71038\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.129500 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcbxb\" (UniqueName: \"kubernetes.io/projected/81aaff5d-9686-458d-bd32-221d0ae71038-kube-api-access-hcbxb\") pod \"81aaff5d-9686-458d-bd32-221d0ae71038\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.129680 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-utilities\") pod 
\"81aaff5d-9686-458d-bd32-221d0ae71038\" (UID: \"81aaff5d-9686-458d-bd32-221d0ae71038\") " Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.130670 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-utilities" (OuterVolumeSpecName: "utilities") pod "81aaff5d-9686-458d-bd32-221d0ae71038" (UID: "81aaff5d-9686-458d-bd32-221d0ae71038"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.148133 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81aaff5d-9686-458d-bd32-221d0ae71038-kube-api-access-hcbxb" (OuterVolumeSpecName: "kube-api-access-hcbxb") pod "81aaff5d-9686-458d-bd32-221d0ae71038" (UID: "81aaff5d-9686-458d-bd32-221d0ae71038"). InnerVolumeSpecName "kube-api-access-hcbxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.185957 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81aaff5d-9686-458d-bd32-221d0ae71038" (UID: "81aaff5d-9686-458d-bd32-221d0ae71038"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.233149 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.233199 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcbxb\" (UniqueName: \"kubernetes.io/projected/81aaff5d-9686-458d-bd32-221d0ae71038-kube-api-access-hcbxb\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.233224 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81aaff5d-9686-458d-bd32-221d0ae71038-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.340119 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cc279" event={"ID":"dc9d97e5-c723-4f3e-b6f4-9e123f907f07","Type":"ContainerStarted","Data":"4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973"} Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.345236 4931 generic.go:334] "Generic (PLEG): container finished" podID="81aaff5d-9686-458d-bd32-221d0ae71038" containerID="89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4" exitCode=0 Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.345289 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fq22" event={"ID":"81aaff5d-9686-458d-bd32-221d0ae71038","Type":"ContainerDied","Data":"89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4"} Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.345321 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fq22" 
event={"ID":"81aaff5d-9686-458d-bd32-221d0ae71038","Type":"ContainerDied","Data":"44778904158eedde6f889cfc7e7aa952fb69ead9dd240f8b4485e99e02abc043"} Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.345342 4931 scope.go:117] "RemoveContainer" containerID="89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.345686 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fq22" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.374323 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cc279" podStartSLOduration=2.80266755 podStartE2EDuration="10.374306865s" podCreationTimestamp="2026-01-30 06:58:31 +0000 UTC" firstStartedPulling="2026-01-30 06:58:33.230893805 +0000 UTC m=+6648.600804062" lastFinishedPulling="2026-01-30 06:58:40.80253313 +0000 UTC m=+6656.172443377" observedRunningTime="2026-01-30 06:58:41.372270518 +0000 UTC m=+6656.742180785" watchObservedRunningTime="2026-01-30 06:58:41.374306865 +0000 UTC m=+6656.744217122" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.376872 4931 scope.go:117] "RemoveContainer" containerID="af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.396547 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fq22"] Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.412866 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6fq22"] Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.428459 4931 scope.go:117] "RemoveContainer" containerID="f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.438172 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="81aaff5d-9686-458d-bd32-221d0ae71038" path="/var/lib/kubelet/pods/81aaff5d-9686-458d-bd32-221d0ae71038/volumes" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.464842 4931 scope.go:117] "RemoveContainer" containerID="89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4" Jan 30 06:58:41 crc kubenswrapper[4931]: E0130 06:58:41.465224 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4\": container with ID starting with 89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4 not found: ID does not exist" containerID="89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.465265 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4"} err="failed to get container status \"89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4\": rpc error: code = NotFound desc = could not find container \"89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4\": container with ID starting with 89295dd9de1f3812fa2d886ac85237f5a1f7e9aeb59b7748f94d7dfed35106a4 not found: ID does not exist" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.465289 4931 scope.go:117] "RemoveContainer" containerID="af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0" Jan 30 06:58:41 crc kubenswrapper[4931]: E0130 06:58:41.465880 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0\": container with ID starting with af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0 not found: ID does not exist" 
containerID="af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.465904 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0"} err="failed to get container status \"af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0\": rpc error: code = NotFound desc = could not find container \"af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0\": container with ID starting with af4105548c114f658f29af4ea5a153d29b084d50d483937fb38f078b1c091db0 not found: ID does not exist" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.465920 4931 scope.go:117] "RemoveContainer" containerID="f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b" Jan 30 06:58:41 crc kubenswrapper[4931]: E0130 06:58:41.466222 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b\": container with ID starting with f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b not found: ID does not exist" containerID="f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b" Jan 30 06:58:41 crc kubenswrapper[4931]: I0130 06:58:41.466263 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b"} err="failed to get container status \"f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b\": rpc error: code = NotFound desc = could not find container \"f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b\": container with ID starting with f78ce1b9e6636a9258d6ae0a8b9483212143bc4b1f12d27b352d9926d4e7c28b not found: ID does not exist" Jan 30 06:58:42 crc kubenswrapper[4931]: I0130 06:58:42.332943 4931 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:42 crc kubenswrapper[4931]: I0130 06:58:42.333011 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:43 crc kubenswrapper[4931]: I0130 06:58:43.396307 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cc279" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="registry-server" probeResult="failure" output=< Jan 30 06:58:43 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Jan 30 06:58:43 crc kubenswrapper[4931]: > Jan 30 06:58:52 crc kubenswrapper[4931]: I0130 06:58:52.397930 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:52 crc kubenswrapper[4931]: I0130 06:58:52.489549 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:53 crc kubenswrapper[4931]: I0130 06:58:53.365595 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cc279"] Jan 30 06:58:53 crc kubenswrapper[4931]: I0130 06:58:53.486597 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cc279" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="registry-server" containerID="cri-o://4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973" gracePeriod=2 Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.067331 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.245484 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-utilities\") pod \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.245724 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-catalog-content\") pod \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.245836 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dgkk\" (UniqueName: \"kubernetes.io/projected/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-kube-api-access-7dgkk\") pod \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\" (UID: \"dc9d97e5-c723-4f3e-b6f4-9e123f907f07\") " Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.246871 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-utilities" (OuterVolumeSpecName: "utilities") pod "dc9d97e5-c723-4f3e-b6f4-9e123f907f07" (UID: "dc9d97e5-c723-4f3e-b6f4-9e123f907f07"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.247484 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.258017 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-kube-api-access-7dgkk" (OuterVolumeSpecName: "kube-api-access-7dgkk") pod "dc9d97e5-c723-4f3e-b6f4-9e123f907f07" (UID: "dc9d97e5-c723-4f3e-b6f4-9e123f907f07"). InnerVolumeSpecName "kube-api-access-7dgkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.349395 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dgkk\" (UniqueName: \"kubernetes.io/projected/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-kube-api-access-7dgkk\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.429112 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc9d97e5-c723-4f3e-b6f4-9e123f907f07" (UID: "dc9d97e5-c723-4f3e-b6f4-9e123f907f07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.451151 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9d97e5-c723-4f3e-b6f4-9e123f907f07-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.495929 4931 generic.go:334] "Generic (PLEG): container finished" podID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerID="4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973" exitCode=0 Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.495967 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cc279" event={"ID":"dc9d97e5-c723-4f3e-b6f4-9e123f907f07","Type":"ContainerDied","Data":"4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973"} Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.495991 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cc279" event={"ID":"dc9d97e5-c723-4f3e-b6f4-9e123f907f07","Type":"ContainerDied","Data":"be5ecc3b88dee9a9394ab880d76a6331d57c03091370e6e5e590da9fe25fb187"} Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.496007 4931 scope.go:117] "RemoveContainer" containerID="4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.496115 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cc279" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.528391 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cc279"] Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.530383 4931 scope.go:117] "RemoveContainer" containerID="7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.536507 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cc279"] Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.553743 4931 scope.go:117] "RemoveContainer" containerID="b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.596684 4931 scope.go:117] "RemoveContainer" containerID="4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973" Jan 30 06:58:54 crc kubenswrapper[4931]: E0130 06:58:54.600852 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973\": container with ID starting with 4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973 not found: ID does not exist" containerID="4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.600897 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973"} err="failed to get container status \"4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973\": rpc error: code = NotFound desc = could not find container \"4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973\": container with ID starting with 4dcb7c8bd35030c7351b2cf2466e370f7e0c1d928b94b7b9c1c6f286ea77b973 not found: ID does 
not exist" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.600922 4931 scope.go:117] "RemoveContainer" containerID="7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe" Jan 30 06:58:54 crc kubenswrapper[4931]: E0130 06:58:54.601158 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe\": container with ID starting with 7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe not found: ID does not exist" containerID="7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.601212 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe"} err="failed to get container status \"7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe\": rpc error: code = NotFound desc = could not find container \"7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe\": container with ID starting with 7c15be5aea842f2fe5e46c76deae434bd626900eab27b89366443b81f3113abe not found: ID does not exist" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.601228 4931 scope.go:117] "RemoveContainer" containerID="b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c" Jan 30 06:58:54 crc kubenswrapper[4931]: E0130 06:58:54.601479 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c\": container with ID starting with b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c not found: ID does not exist" containerID="b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c" Jan 30 06:58:54 crc kubenswrapper[4931]: I0130 06:58:54.601532 4931 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c"} err="failed to get container status \"b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c\": rpc error: code = NotFound desc = could not find container \"b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c\": container with ID starting with b264fc6a6786432333a6351c5fbe74da5fe62b0440da5331f8bd02e9cb9d8e9c not found: ID does not exist" Jan 30 06:58:55 crc kubenswrapper[4931]: I0130 06:58:55.449000 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" path="/var/lib/kubelet/pods/dc9d97e5-c723-4f3e-b6f4-9e123f907f07/volumes" Jan 30 06:59:03 crc kubenswrapper[4931]: I0130 06:59:03.718074 4931 scope.go:117] "RemoveContainer" containerID="a1eade7d298ab8964a28bb2ee51b88b39f3ffa30229e7d9814ffc7f1e58b96ec" Jan 30 06:59:12 crc kubenswrapper[4931]: I0130 06:59:12.070451 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-fmv6s"] Jan 30 06:59:12 crc kubenswrapper[4931]: I0130 06:59:12.083738 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-cb1e-account-create-update-6drdw"] Jan 30 06:59:12 crc kubenswrapper[4931]: I0130 06:59:12.094574 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-fmv6s"] Jan 30 06:59:12 crc kubenswrapper[4931]: I0130 06:59:12.104821 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-cb1e-account-create-update-6drdw"] Jan 30 06:59:13 crc kubenswrapper[4931]: I0130 06:59:13.441550 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e6957d-e715-4a84-9952-19f773cfe882" path="/var/lib/kubelet/pods/51e6957d-e715-4a84-9952-19f773cfe882/volumes" Jan 30 06:59:13 crc kubenswrapper[4931]: I0130 06:59:13.445063 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="681b527a-d511-4db8-8f19-1df02bbf9f61" path="/var/lib/kubelet/pods/681b527a-d511-4db8-8f19-1df02bbf9f61/volumes" Jan 30 06:59:24 crc kubenswrapper[4931]: I0130 06:59:24.050411 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-rq4fv"] Jan 30 06:59:24 crc kubenswrapper[4931]: I0130 06:59:24.058702 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-rq4fv"] Jan 30 06:59:25 crc kubenswrapper[4931]: I0130 06:59:25.446218 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76eec61d-6ff6-4286-9102-758374c6fa27" path="/var/lib/kubelet/pods/76eec61d-6ff6-4286-9102-758374c6fa27/volumes" Jan 30 06:59:49 crc kubenswrapper[4931]: I0130 06:59:49.088556 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-65c4-account-create-update-rfndg"] Jan 30 06:59:49 crc kubenswrapper[4931]: I0130 06:59:49.109309 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-45ct9"] Jan 30 06:59:49 crc kubenswrapper[4931]: I0130 06:59:49.124691 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-65c4-account-create-update-rfndg"] Jan 30 06:59:49 crc kubenswrapper[4931]: I0130 06:59:49.131830 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-45ct9"] Jan 30 06:59:49 crc kubenswrapper[4931]: I0130 06:59:49.437910 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="037161b5-dad9-4d8f-9be4-f980ee947129" path="/var/lib/kubelet/pods/037161b5-dad9-4d8f-9be4-f980ee947129/volumes" Jan 30 06:59:49 crc kubenswrapper[4931]: I0130 06:59:49.438739 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="448719bb-ff8e-4d9e-982b-a8425f907a15" path="/var/lib/kubelet/pods/448719bb-ff8e-4d9e-982b-a8425f907a15/volumes" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.184360 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9"] Jan 30 07:00:00 crc kubenswrapper[4931]: E0130 07:00:00.185290 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="extract-content" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185305 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="extract-content" Jan 30 07:00:00 crc kubenswrapper[4931]: E0130 07:00:00.185321 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="extract-utilities" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185329 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="extract-utilities" Jan 30 07:00:00 crc kubenswrapper[4931]: E0130 07:00:00.185347 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="extract-content" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185355 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="extract-content" Jan 30 07:00:00 crc kubenswrapper[4931]: E0130 07:00:00.185387 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="extract-utilities" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185395 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="extract-utilities" Jan 30 07:00:00 crc kubenswrapper[4931]: E0130 07:00:00.185409 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185417 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4931]: E0130 07:00:00.185464 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185472 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185716 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="81aaff5d-9686-458d-bd32-221d0ae71038" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.185737 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9d97e5-c723-4f3e-b6f4-9e123f907f07" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.186571 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.189500 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.189513 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.211064 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9"] Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.365192 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-config-volume\") pod 
\"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.365261 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-secret-volume\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.365309 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tdzp\" (UniqueName: \"kubernetes.io/projected/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-kube-api-access-5tdzp\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.467579 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-config-volume\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.467628 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-secret-volume\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.467656 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5tdzp\" (UniqueName: \"kubernetes.io/projected/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-kube-api-access-5tdzp\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.470296 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-config-volume\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.479193 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-secret-volume\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.495536 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tdzp\" (UniqueName: \"kubernetes.io/projected/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-kube-api-access-5tdzp\") pod \"collect-profiles-29495940-2zjc9\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:00 crc kubenswrapper[4931]: I0130 07:00:00.511300 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:01 crc kubenswrapper[4931]: I0130 07:00:01.046873 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9"] Jan 30 07:00:01 crc kubenswrapper[4931]: I0130 07:00:01.246404 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" event={"ID":"6677b63a-e1ff-4d6f-9b6f-d74e60885d60","Type":"ContainerStarted","Data":"cc7e299fb616f245f3fd8ebbffc8bbb401b10a331200fb770a781891bf8cf695"} Jan 30 07:00:02 crc kubenswrapper[4931]: I0130 07:00:02.259222 4931 generic.go:334] "Generic (PLEG): container finished" podID="6677b63a-e1ff-4d6f-9b6f-d74e60885d60" containerID="6d6a598573685f7bcd09f1bb9e1195b900bc135c468f8a7919e2b7f988b0e7aa" exitCode=0 Jan 30 07:00:02 crc kubenswrapper[4931]: I0130 07:00:02.259287 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" event={"ID":"6677b63a-e1ff-4d6f-9b6f-d74e60885d60","Type":"ContainerDied","Data":"6d6a598573685f7bcd09f1bb9e1195b900bc135c468f8a7919e2b7f988b0e7aa"} Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.555037 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-97qh7"] Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.558235 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.584007 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-97qh7"] Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.747968 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvnj\" (UniqueName: \"kubernetes.io/projected/d11b6770-bb41-4825-90a4-3b4af2daecd9-kube-api-access-8tvnj\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.748455 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-catalog-content\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.748644 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-utilities\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.750120 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.850035 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-config-volume\") pod \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.850275 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-secret-volume\") pod \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.850315 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tdzp\" (UniqueName: \"kubernetes.io/projected/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-kube-api-access-5tdzp\") pod \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\" (UID: \"6677b63a-e1ff-4d6f-9b6f-d74e60885d60\") " Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.850759 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-catalog-content\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.850869 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-utilities\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 
07:00:03.850962 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvnj\" (UniqueName: \"kubernetes.io/projected/d11b6770-bb41-4825-90a4-3b4af2daecd9-kube-api-access-8tvnj\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.851072 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-config-volume" (OuterVolumeSpecName: "config-volume") pod "6677b63a-e1ff-4d6f-9b6f-d74e60885d60" (UID: "6677b63a-e1ff-4d6f-9b6f-d74e60885d60"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.851286 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-catalog-content\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.851344 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-utilities\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.851607 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.857133 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-kube-api-access-5tdzp" (OuterVolumeSpecName: "kube-api-access-5tdzp") pod "6677b63a-e1ff-4d6f-9b6f-d74e60885d60" (UID: "6677b63a-e1ff-4d6f-9b6f-d74e60885d60"). InnerVolumeSpecName "kube-api-access-5tdzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.857628 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6677b63a-e1ff-4d6f-9b6f-d74e60885d60" (UID: "6677b63a-e1ff-4d6f-9b6f-d74e60885d60"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.878511 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvnj\" (UniqueName: \"kubernetes.io/projected/d11b6770-bb41-4825-90a4-3b4af2daecd9-kube-api-access-8tvnj\") pod \"community-operators-97qh7\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.883485 4931 scope.go:117] "RemoveContainer" containerID="0e4d3615364adb9fc327ac5ce20cdd4fecf281a043a844859ed5dca539ce5720" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.901225 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.954044 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:03 crc kubenswrapper[4931]: I0130 07:00:03.954073 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tdzp\" (UniqueName: \"kubernetes.io/projected/6677b63a-e1ff-4d6f-9b6f-d74e60885d60-kube-api-access-5tdzp\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.034123 4931 scope.go:117] "RemoveContainer" containerID="7c8becae24c7a8a33bf584e1ab34512a30cd0f1208b8f42cb257da9c6245e6c8" Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.079998 4931 scope.go:117] "RemoveContainer" containerID="113fabb6410c6ac50615d981ed7da97e3148c88e8fc0cf34f88de6f851a2a62e" Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.139018 4931 scope.go:117] "RemoveContainer" containerID="030be6de81f263d984b02a8d10e7722844ea7978d675c59a14a66ccbbd2666b2" Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.168074 4931 scope.go:117] "RemoveContainer" containerID="80ab6efc7f6dcfb70eed703ea54962d42118f91ddd843c75b9238af6658827ba" Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.285538 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" event={"ID":"6677b63a-e1ff-4d6f-9b6f-d74e60885d60","Type":"ContainerDied","Data":"cc7e299fb616f245f3fd8ebbffc8bbb401b10a331200fb770a781891bf8cf695"} Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.285580 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc7e299fb616f245f3fd8ebbffc8bbb401b10a331200fb770a781891bf8cf695" Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.285600 4931 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-2zjc9" Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.413340 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-97qh7"] Jan 30 07:00:04 crc kubenswrapper[4931]: W0130 07:00:04.417045 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11b6770_bb41_4825_90a4_3b4af2daecd9.slice/crio-530b9dd20a883a453a06f3f485065d62bc9fa1539abe4bb60cd79097ceb450b3 WatchSource:0}: Error finding container 530b9dd20a883a453a06f3f485065d62bc9fa1539abe4bb60cd79097ceb450b3: Status 404 returned error can't find the container with id 530b9dd20a883a453a06f3f485065d62bc9fa1539abe4bb60cd79097ceb450b3 Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.835734 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"] Jan 30 07:00:04 crc kubenswrapper[4931]: I0130 07:00:04.848768 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-64nq8"] Jan 30 07:00:05 crc kubenswrapper[4931]: I0130 07:00:05.300177 4931 generic.go:334] "Generic (PLEG): container finished" podID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerID="f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b" exitCode=0 Jan 30 07:00:05 crc kubenswrapper[4931]: I0130 07:00:05.300266 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97qh7" event={"ID":"d11b6770-bb41-4825-90a4-3b4af2daecd9","Type":"ContainerDied","Data":"f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b"} Jan 30 07:00:05 crc kubenswrapper[4931]: I0130 07:00:05.300364 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97qh7" 
event={"ID":"d11b6770-bb41-4825-90a4-3b4af2daecd9","Type":"ContainerStarted","Data":"530b9dd20a883a453a06f3f485065d62bc9fa1539abe4bb60cd79097ceb450b3"} Jan 30 07:00:05 crc kubenswrapper[4931]: I0130 07:00:05.436840 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d1c4a0-d36c-47d4-b603-3320c87f7c8e" path="/var/lib/kubelet/pods/f7d1c4a0-d36c-47d4-b603-3320c87f7c8e/volumes" Jan 30 07:00:07 crc kubenswrapper[4931]: I0130 07:00:07.322888 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97qh7" event={"ID":"d11b6770-bb41-4825-90a4-3b4af2daecd9","Type":"ContainerStarted","Data":"7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a"} Jan 30 07:00:08 crc kubenswrapper[4931]: I0130 07:00:08.337370 4931 generic.go:334] "Generic (PLEG): container finished" podID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerID="7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a" exitCode=0 Jan 30 07:00:08 crc kubenswrapper[4931]: I0130 07:00:08.337589 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97qh7" event={"ID":"d11b6770-bb41-4825-90a4-3b4af2daecd9","Type":"ContainerDied","Data":"7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a"} Jan 30 07:00:09 crc kubenswrapper[4931]: I0130 07:00:09.051045 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-ctjj7"] Jan 30 07:00:09 crc kubenswrapper[4931]: I0130 07:00:09.060757 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-ctjj7"] Jan 30 07:00:09 crc kubenswrapper[4931]: I0130 07:00:09.351528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97qh7" event={"ID":"d11b6770-bb41-4825-90a4-3b4af2daecd9","Type":"ContainerStarted","Data":"1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8"} Jan 30 07:00:09 crc kubenswrapper[4931]: I0130 
07:00:09.377090 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-97qh7" podStartSLOduration=2.967603374 podStartE2EDuration="6.377065167s" podCreationTimestamp="2026-01-30 07:00:03 +0000 UTC" firstStartedPulling="2026-01-30 07:00:05.301798352 +0000 UTC m=+6740.671708609" lastFinishedPulling="2026-01-30 07:00:08.711260145 +0000 UTC m=+6744.081170402" observedRunningTime="2026-01-30 07:00:09.372440367 +0000 UTC m=+6744.742350624" watchObservedRunningTime="2026-01-30 07:00:09.377065167 +0000 UTC m=+6744.746975434" Jan 30 07:00:09 crc kubenswrapper[4931]: I0130 07:00:09.433165 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f518288-3c69-4f3a-9e32-9f9211cab22a" path="/var/lib/kubelet/pods/2f518288-3c69-4f3a-9e32-9f9211cab22a/volumes" Jan 30 07:00:13 crc kubenswrapper[4931]: I0130 07:00:13.902101 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:13 crc kubenswrapper[4931]: I0130 07:00:13.902565 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:13 crc kubenswrapper[4931]: I0130 07:00:13.992694 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:14 crc kubenswrapper[4931]: I0130 07:00:14.475375 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:14 crc kubenswrapper[4931]: I0130 07:00:14.539391 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-97qh7"] Jan 30 07:00:16 crc kubenswrapper[4931]: I0130 07:00:16.430796 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-97qh7" 
podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="registry-server" containerID="cri-o://1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8" gracePeriod=2 Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.014353 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.171133 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-utilities\") pod \"d11b6770-bb41-4825-90a4-3b4af2daecd9\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.171457 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-catalog-content\") pod \"d11b6770-bb41-4825-90a4-3b4af2daecd9\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.171550 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tvnj\" (UniqueName: \"kubernetes.io/projected/d11b6770-bb41-4825-90a4-3b4af2daecd9-kube-api-access-8tvnj\") pod \"d11b6770-bb41-4825-90a4-3b4af2daecd9\" (UID: \"d11b6770-bb41-4825-90a4-3b4af2daecd9\") " Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.172259 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-utilities" (OuterVolumeSpecName: "utilities") pod "d11b6770-bb41-4825-90a4-3b4af2daecd9" (UID: "d11b6770-bb41-4825-90a4-3b4af2daecd9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.189601 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11b6770-bb41-4825-90a4-3b4af2daecd9-kube-api-access-8tvnj" (OuterVolumeSpecName: "kube-api-access-8tvnj") pod "d11b6770-bb41-4825-90a4-3b4af2daecd9" (UID: "d11b6770-bb41-4825-90a4-3b4af2daecd9"). InnerVolumeSpecName "kube-api-access-8tvnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.275815 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tvnj\" (UniqueName: \"kubernetes.io/projected/d11b6770-bb41-4825-90a4-3b4af2daecd9-kube-api-access-8tvnj\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.275852 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.417853 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d11b6770-bb41-4825-90a4-3b4af2daecd9" (UID: "d11b6770-bb41-4825-90a4-3b4af2daecd9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.456731 4931 generic.go:334] "Generic (PLEG): container finished" podID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerID="1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8" exitCode=0 Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.456776 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97qh7" event={"ID":"d11b6770-bb41-4825-90a4-3b4af2daecd9","Type":"ContainerDied","Data":"1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8"} Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.456807 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97qh7" event={"ID":"d11b6770-bb41-4825-90a4-3b4af2daecd9","Type":"ContainerDied","Data":"530b9dd20a883a453a06f3f485065d62bc9fa1539abe4bb60cd79097ceb450b3"} Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.456826 4931 scope.go:117] "RemoveContainer" containerID="1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.457010 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-97qh7" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.481295 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b6770-bb41-4825-90a4-3b4af2daecd9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.508089 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-97qh7"] Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.512823 4931 scope.go:117] "RemoveContainer" containerID="7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.515588 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-97qh7"] Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.536264 4931 scope.go:117] "RemoveContainer" containerID="f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.588905 4931 scope.go:117] "RemoveContainer" containerID="1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8" Jan 30 07:00:17 crc kubenswrapper[4931]: E0130 07:00:17.589403 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8\": container with ID starting with 1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8 not found: ID does not exist" containerID="1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.589544 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8"} err="failed to get container status 
\"1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8\": rpc error: code = NotFound desc = could not find container \"1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8\": container with ID starting with 1ca690e5fa631c6708487e76962fb8c404eb1fe8ac6981253fd5468444515de8 not found: ID does not exist" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.589570 4931 scope.go:117] "RemoveContainer" containerID="7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a" Jan 30 07:00:17 crc kubenswrapper[4931]: E0130 07:00:17.589955 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a\": container with ID starting with 7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a not found: ID does not exist" containerID="7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.589981 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a"} err="failed to get container status \"7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a\": rpc error: code = NotFound desc = could not find container \"7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a\": container with ID starting with 7fa3fd6b0b8a185936b360a7a6337b2b60aaadfd47734a302a50db94babeb38a not found: ID does not exist" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.589996 4931 scope.go:117] "RemoveContainer" containerID="f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b" Jan 30 07:00:17 crc kubenswrapper[4931]: E0130 07:00:17.590281 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b\": container with ID starting with f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b not found: ID does not exist" containerID="f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b" Jan 30 07:00:17 crc kubenswrapper[4931]: I0130 07:00:17.590314 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b"} err="failed to get container status \"f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b\": rpc error: code = NotFound desc = could not find container \"f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b\": container with ID starting with f7ec50cafb0866f182190f0d39f8e5ed6d5a522707e1a662b4113d83ea844e3b not found: ID does not exist" Jan 30 07:00:19 crc kubenswrapper[4931]: I0130 07:00:19.438233 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" path="/var/lib/kubelet/pods/d11b6770-bb41-4825-90a4-3b4af2daecd9/volumes" Jan 30 07:00:57 crc kubenswrapper[4931]: I0130 07:00:57.362562 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 07:00:57 crc kubenswrapper[4931]: I0130 07:00:57.363226 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.159641 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-cron-29495941-bp296"] Jan 30 07:01:00 crc kubenswrapper[4931]: E0130 07:01:00.160389 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="registry-server" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.160409 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="registry-server" Jan 30 07:01:00 crc kubenswrapper[4931]: E0130 07:01:00.160472 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="extract-utilities" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.160481 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="extract-utilities" Jan 30 07:01:00 crc kubenswrapper[4931]: E0130 07:01:00.160512 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6677b63a-e1ff-4d6f-9b6f-d74e60885d60" containerName="collect-profiles" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.160521 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6677b63a-e1ff-4d6f-9b6f-d74e60885d60" containerName="collect-profiles" Jan 30 07:01:00 crc kubenswrapper[4931]: E0130 07:01:00.160535 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="extract-content" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.160543 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="extract-content" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.160787 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11b6770-bb41-4825-90a4-3b4af2daecd9" containerName="registry-server" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.160813 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6677b63a-e1ff-4d6f-9b6f-d74e60885d60" containerName="collect-profiles" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.161825 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.189793 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29495941-bp296"] Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.245606 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzd8\" (UniqueName: \"kubernetes.io/projected/c3963310-007b-4e75-9a1f-6e84507084c3-kube-api-access-pjzd8\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.245663 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-fernet-keys\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.245765 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-combined-ca-bundle\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.245796 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-config-data\") pod \"keystone-cron-29495941-bp296\" (UID: 
\"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.348739 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-combined-ca-bundle\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.348827 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-config-data\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.349037 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzd8\" (UniqueName: \"kubernetes.io/projected/c3963310-007b-4e75-9a1f-6e84507084c3-kube-api-access-pjzd8\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.349092 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-fernet-keys\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.357773 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-fernet-keys\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " 
pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.358537 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-config-data\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.359728 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-combined-ca-bundle\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.371856 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzd8\" (UniqueName: \"kubernetes.io/projected/c3963310-007b-4e75-9a1f-6e84507084c3-kube-api-access-pjzd8\") pod \"keystone-cron-29495941-bp296\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") " pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.545125 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29495941-bp296" Jan 30 07:01:00 crc kubenswrapper[4931]: I0130 07:01:00.992513 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29495941-bp296"] Jan 30 07:01:01 crc kubenswrapper[4931]: I0130 07:01:01.997994 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-bp296" event={"ID":"c3963310-007b-4e75-9a1f-6e84507084c3","Type":"ContainerStarted","Data":"09e69bd89e7ad5ed2585b6b1cd788defcf2c6ec2028c23926ae8ca0f752a9f20"} Jan 30 07:01:02 crc kubenswrapper[4931]: I0130 07:01:01.998874 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-bp296" event={"ID":"c3963310-007b-4e75-9a1f-6e84507084c3","Type":"ContainerStarted","Data":"6cdb452b95e20b6acbd95299d7131d4cf88076d5ac5b21f2922a236dafd43211"} Jan 30 07:01:02 crc kubenswrapper[4931]: I0130 07:01:02.025457 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29495941-bp296" podStartSLOduration=2.025413828 podStartE2EDuration="2.025413828s" podCreationTimestamp="2026-01-30 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 07:01:02.023546736 +0000 UTC m=+6797.393457023" watchObservedRunningTime="2026-01-30 07:01:02.025413828 +0000 UTC m=+6797.395324125" Jan 30 07:01:04 crc kubenswrapper[4931]: I0130 07:01:04.282868 4931 scope.go:117] "RemoveContainer" containerID="d9b4e7ab55cdff59c11d247a35f15b900c5c2d23ac2be2cf5caa19378305d01e" Jan 30 07:01:04 crc kubenswrapper[4931]: I0130 07:01:04.329239 4931 scope.go:117] "RemoveContainer" containerID="e300d33068406baea942af2b5b021d10a35ce639099354dd534b82d9b9278f4c" Jan 30 07:01:05 crc kubenswrapper[4931]: I0130 07:01:05.026590 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3963310-007b-4e75-9a1f-6e84507084c3" 
containerID="09e69bd89e7ad5ed2585b6b1cd788defcf2c6ec2028c23926ae8ca0f752a9f20" exitCode=0
Jan 30 07:01:05 crc kubenswrapper[4931]: I0130 07:01:05.026644 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-bp296" event={"ID":"c3963310-007b-4e75-9a1f-6e84507084c3","Type":"ContainerDied","Data":"09e69bd89e7ad5ed2585b6b1cd788defcf2c6ec2028c23926ae8ca0f752a9f20"}
Jan 30 07:01:06 crc kubenswrapper[4931]: E0130 07:01:06.307730 4931 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.504362 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29495941-bp296"
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.597470 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-fernet-keys\") pod \"c3963310-007b-4e75-9a1f-6e84507084c3\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") "
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.609448 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c3963310-007b-4e75-9a1f-6e84507084c3" (UID: "c3963310-007b-4e75-9a1f-6e84507084c3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.616481 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-combined-ca-bundle\") pod \"c3963310-007b-4e75-9a1f-6e84507084c3\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") "
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.616556 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-config-data\") pod \"c3963310-007b-4e75-9a1f-6e84507084c3\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") "
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.616687 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzd8\" (UniqueName: \"kubernetes.io/projected/c3963310-007b-4e75-9a1f-6e84507084c3-kube-api-access-pjzd8\") pod \"c3963310-007b-4e75-9a1f-6e84507084c3\" (UID: \"c3963310-007b-4e75-9a1f-6e84507084c3\") "
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.617870 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.631871 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3963310-007b-4e75-9a1f-6e84507084c3-kube-api-access-pjzd8" (OuterVolumeSpecName: "kube-api-access-pjzd8") pod "c3963310-007b-4e75-9a1f-6e84507084c3" (UID: "c3963310-007b-4e75-9a1f-6e84507084c3"). InnerVolumeSpecName "kube-api-access-pjzd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.660599 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3963310-007b-4e75-9a1f-6e84507084c3" (UID: "c3963310-007b-4e75-9a1f-6e84507084c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.711539 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-config-data" (OuterVolumeSpecName: "config-data") pod "c3963310-007b-4e75-9a1f-6e84507084c3" (UID: "c3963310-007b-4e75-9a1f-6e84507084c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.719938 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.720003 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963310-007b-4e75-9a1f-6e84507084c3-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 07:01:06 crc kubenswrapper[4931]: I0130 07:01:06.720031 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzd8\" (UniqueName: \"kubernetes.io/projected/c3963310-007b-4e75-9a1f-6e84507084c3-kube-api-access-pjzd8\") on node \"crc\" DevicePath \"\""
Jan 30 07:01:07 crc kubenswrapper[4931]: I0130 07:01:07.058308 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-bp296" event={"ID":"c3963310-007b-4e75-9a1f-6e84507084c3","Type":"ContainerDied","Data":"6cdb452b95e20b6acbd95299d7131d4cf88076d5ac5b21f2922a236dafd43211"}
Jan 30 07:01:07 crc kubenswrapper[4931]: I0130 07:01:07.058364 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cdb452b95e20b6acbd95299d7131d4cf88076d5ac5b21f2922a236dafd43211"
Jan 30 07:01:07 crc kubenswrapper[4931]: I0130 07:01:07.058495 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29495941-bp296"
Jan 30 07:01:27 crc kubenswrapper[4931]: I0130 07:01:27.362919 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 07:01:27 crc kubenswrapper[4931]: I0130 07:01:27.363351 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.362730 4931 patch_prober.go:28] interesting pod/machine-config-daemon-wfdxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.363386 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.363501 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs"
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.364586 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8299e6538c113448d4bfd5d8d4cfd1ff286a1ac24b25d9c331a09fd1c9ae275"} pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.364668 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" podUID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerName="machine-config-daemon" containerID="cri-o://e8299e6538c113448d4bfd5d8d4cfd1ff286a1ac24b25d9c331a09fd1c9ae275" gracePeriod=600
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.723153 4931 generic.go:334] "Generic (PLEG): container finished" podID="189be3dc-d439-47c2-b1f2-7413fc4b5e85" containerID="e8299e6538c113448d4bfd5d8d4cfd1ff286a1ac24b25d9c331a09fd1c9ae275" exitCode=0
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.723278 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerDied","Data":"e8299e6538c113448d4bfd5d8d4cfd1ff286a1ac24b25d9c331a09fd1c9ae275"}
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.723875 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfdxs" event={"ID":"189be3dc-d439-47c2-b1f2-7413fc4b5e85","Type":"ContainerStarted","Data":"22b60e2f05f30864d2598292b5a5136957888853e8221f9434b6c9fef14ee97c"}
Jan 30 07:01:57 crc kubenswrapper[4931]: I0130 07:01:57.723911 4931 scope.go:117] "RemoveContainer" containerID="1ff15d5aac3f4c0764845266c3dd85db97427d881e5c701d217d7599df18276d"